CN105631828A - Image processing method and device - Google Patents

Image processing method and device

Info

Publication number
CN105631828A
Authority
CN
China
Prior art keywords
frame
image
pixel
registration
reference image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201511009631.0A
Other languages
Chinese (zh)
Inventor
张一帆
邵晓鹏
王海馨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Xidian University
Original Assignee
Huawei Technologies Co Ltd
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd, Xidian University
Priority to CN201511009631.0A
Publication of CN105631828A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/73 Deblurring; Sharpening
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the invention discloses an image processing method and device, relates to the field of image processing, and aims to enhance the definition of an acquired image. The method comprises: acquiring N frames of blurred images of a target scene, where N is greater than or equal to 2 and N is an integer; preprocessing each frame of the N frames of blurred images to obtain N frames of first images, where one frame of blurred image corresponds to one frame of first image; extracting at least three groups of feature points from the N frames of first images, where each group of feature points comprises N feature points and each feature point in a group corresponds to one frame of first image; with a reference image as a benchmark, registering each frame of image to be registered according to the feature points of that frame in the at least three groups of feature points and the feature points of the reference image in the at least three groups of feature points, where the reference image is any one frame of the N frames of first images and the images to be registered are the first images other than the reference image among the N frames of first images; and synthesizing the N frames of registered first images to obtain a target image.

Description

Method and device for processing an image
Technical field
The present invention relates to the field of image processing, and in particular to a method and a device for processing an image.
Background art
With the rapid development of smartphones, users' requirements on the camera function of smartphones are becoming ever higher. At present the camera function of many smartphones can basically meet the requirements of shooting situations such as outdoor scenes, portraits, macro shots, night scenes and insufficiently lit indoor scenes, and the shooting technique is fairly mature. However, when shooting scenes such as the starry sky at night or fireworks at night, the result achieved by a smartphone still falls short of that of a professional camera.
Because the illumination conditions at night are relatively complex, the photographs taken tend to be blurred, and when shooting scenes such as the starry sky or fireworks it is necessary to use long exposures to increase the probability of obtaining a good photograph. When a long exposure is used, it is preferable for the smartphone to be kept free of vibration so as to reduce noise; however, for a smartphone user, movement of the user also makes the photographs blurred.
Summary of the invention
Embodiments of the invention provide a method and a device for processing an image, so as to improve the sharpness of an acquired image.
To achieve the above object, the embodiments of the invention adopt the following technical solutions.
In a first aspect, a method for processing an image is provided, comprising:
acquiring N frames of blurred images of a target scene, where N >= 2 and N is an integer;
preprocessing each frame of blurred image in the N frames of blurred images to obtain N frames of first images, where one frame of blurred image corresponds to one frame of first image;
extracting at least three groups of feature points from the N frames of first images, where each group of feature points comprises N feature points and each feature point in a group corresponds to one frame of first image;
with a reference image as a benchmark, registering each frame of image to be registered according to the feature points of that frame in the at least three groups of feature points and the feature points of the reference image in the at least three groups of feature points, where the reference image is any one frame of first image in the N frames of first images and the images to be registered are the first images other than the reference image in the N frames of first images; and
synthesizing the N frames of registered first images to obtain a target image.
Optionally, preprocessing each frame of blurred image in the N frames of blurred images to obtain the N frames of first images comprises:
deblurring each frame of blurred image in the N frames of blurred images to obtain N frames of intermediate images, where one frame of blurred image corresponds to one frame of intermediate image; and
denoising each frame of intermediate image in the N frames of intermediate images to obtain the N frames of first images, where one frame of intermediate image corresponds to one frame of first image.
Optionally, acquiring each frame of blurred image in the N frames of blurred images of the target scene comprises:
performing multiple sub-exposures to obtain the frame of blurred image, where the multiple sub-exposures are obtained by splitting a single exposure using multiple binary code sequences, one binary code sequence corresponding to one sub-exposure.
Optionally, deblurring a frame of blurred image in the N frames of blurred images comprises:
determining the point spread function of the frame of blurred image according to its degree of blur;
determining a scaling factor according to the coded aperture, and scaling the frame of blurred image with an interpolation algorithm according to the scaling factor to obtain a second image; and
deblurring the second image according to the point spread function and a preset deconvolution algorithm, and restoring the deblurred second image to the size of the frame of blurred image to obtain the intermediate image corresponding to the frame of blurred image.
Optionally, denoising a frame of intermediate image in the N frames of intermediate images comprises:
removing the noise in the frame of intermediate image by means of a median filtering method to obtain the first image corresponding to the frame of intermediate image.
Further, removing the noise in the frame of intermediate image by means of the median filtering method comprises:
selecting a filter window of size m × m, where m >= 3 and m is an odd number; and
sliding the filter window over the frame of intermediate image, and at each window position calculating the median of the gray values of the pixels in the filter window and replacing the gray value of the pixel in the filter window with the median.
Further, calculating the median of the gray values of the pixels in the filter window comprises:
calculating the mean of the gray values of the M pixels in the filter window, where M = m × m;
judging whether the gray values of the M pixels are all equal to the mean of the gray values of the M pixels;
if so, determining that the gray value of any one of the M pixels is the median of the gray values of the pixels in the filter window;
if not, dividing the M pixels into two sets and judging whether the numbers of pixels in the two sets are both less than M/2; if so, then when the number of pixels in one of the two sets is greater than or equal to the number of pixels in the other set, determining that the gray value of the (⌊M/2⌋ + 1 - s)-th pixel of the other set, taken in order from largest to smallest, is the median, where s is the number of pixels in the one set, and when the number of pixels in the one set is less than the number of pixels in the other set, determining that the gray value of the (⌊M/2⌋ + 1 - t)-th pixel of the one set, taken in order from smallest to largest, is the median, where t is the number of pixels in the other set; and if not, continuing to divide the pixels in whichever of the two sets contains more pixels into two sets, until the numbers of pixels in the two finally separated sets are both less than M/2.
With this method, the median of the gray values of the pixels in the filter window can be determined quickly.
Optionally, extracting the at least three groups of feature points of the N frames of first images comprises:
calculating the curvature and gradient of each pixel in the reference image, and determining at least three feature points of the reference image according to the curvature and gradient of each pixel; and
determining, according to the curvatures and gradients of the at least three feature points of the reference image, the feature point in each frame of image to be registered that corresponds to each of the at least three feature points of the reference image, so as to obtain at least three feature points of each frame of image to be registered, where the set consisting of one feature point of the reference image and the corresponding feature points in all of the images to be registered is one group of feature points.
In a second aspect, a device for processing an image is provided, comprising:
an acquiring unit, configured to acquire N frames of blurred images of a target scene, where N >= 2 and N is an integer;
a preprocessing unit, configured to preprocess each frame of blurred image in the N frames of blurred images to obtain N frames of first images, where one frame of blurred image corresponds to one frame of first image;
an extraction unit, configured to extract at least three groups of feature points from the N frames of first images, where each group of feature points comprises N feature points and each feature point in a group corresponds to one frame of first image;
a registration unit, configured to register, with a reference image as a benchmark, each frame of image to be registered according to the feature points of that frame in the at least three groups of feature points and the feature points of the reference image in the at least three groups of feature points, where the reference image is any one frame of first image in the N frames of first images and the images to be registered are the first images other than the reference image in the N frames of first images; and
a synthesis unit, configured to synthesize the N frames of registered first images to obtain a target image.
Optionally, the preprocessing unit comprises:
a deblurring unit, configured to deblur each frame of blurred image in the N frames of blurred images to obtain N frames of intermediate images, where one frame of blurred image corresponds to one frame of intermediate image; and
a denoising unit, configured to denoise each frame of intermediate image in the N frames of intermediate images to obtain the N frames of first images, where one frame of intermediate image corresponds to one frame of first image.
Optionally, the acquiring unit is configured to: perform multiple sub-exposures to obtain a frame of blurred image, where the multiple sub-exposures are obtained by splitting a single exposure using multiple binary code sequences, one binary code sequence corresponding to one sub-exposure.
Optionally, the deblurring unit is configured to:
determine the point spread function of a frame of blurred image according to its degree of blur;
determine a scaling factor according to the coded aperture, and scale the frame of blurred image with an interpolation algorithm according to the scaling factor to obtain a second image; and
deblur the second image according to the point spread function and a preset deconvolution algorithm, and restore the deblurred second image to the size of the frame of blurred image to obtain the intermediate image corresponding to the frame of blurred image.
Optionally, the denoising unit is configured to:
remove the noise in a frame of intermediate image using a median filter to obtain the first image corresponding to the frame of intermediate image.
Further, the denoising unit is specifically configured to:
select a filter window of size m × m, where m >= 3 and m is an odd number; and
slide the filter window over the frame of intermediate image, and at each window position calculate the median of the gray values of the pixels in the filter window and replace the gray value of the pixel in the filter window with the median.
Further, the denoising unit is specifically configured to:
calculate the mean of the gray values of the M pixels in a filter window, where M = m × m;
judge whether the gray values of the M pixels are all equal to the mean of the gray values of the M pixels;
if so, determine that the gray value of any one of the M pixels is the median of the gray values of the pixels in the filter window;
if not, divide the M pixels into two sets and judge whether the numbers of pixels in the two sets are both less than M/2; if so, then when the number of pixels in one of the two sets is greater than or equal to the number of pixels in the other set, determine that the gray value of the (⌊M/2⌋ + 1 - s)-th pixel of the other set, taken in order from largest to smallest, is the median, where s is the number of pixels in the one set, and when the number of pixels in the one set is less than the number of pixels in the other set, determine that the gray value of the (⌊M/2⌋ + 1 - t)-th pixel of the one set, taken in order from smallest to largest, is the median, where t is the number of pixels in the other set; and if not, continue to divide the pixels in whichever of the two sets contains more pixels into two sets, until the numbers of pixels in the two finally separated sets are both less than M/2.
With this method, the median of the gray values of the pixels in the filter window can be determined quickly.
Optionally, the extraction unit is specifically configured to:
calculate the curvature and gradient of each pixel in the reference image, and determine at least three feature points of the reference image according to the curvature and gradient of each pixel; and
determine, according to the curvatures and gradients of the at least three feature points of the reference image, the feature point in each frame of image to be registered that corresponds to each of the at least three feature points of the reference image, so as to obtain at least three feature points of each frame of image to be registered, where the set consisting of one feature point of the reference image and the corresponding feature points in all of the images to be registered is one group of feature points.
In a third aspect, a device for processing an image is provided, comprising a memory and a processor, where the memory stores a set of code and the processor performs the following actions according to the code:
acquiring N frames of blurred images of a target scene, where N >= 2 and N is an integer;
preprocessing each frame of blurred image in the N frames of blurred images to obtain N frames of first images, where one frame of blurred image corresponds to one frame of first image;
extracting at least three groups of feature points from the N frames of first images, where each group of feature points comprises N feature points and each feature point in a group corresponds to one frame of first image;
with a reference image as a benchmark, registering each frame of image to be registered according to the feature points of that frame in the at least three groups of feature points and the feature points of the reference image in the at least three groups of feature points, where the reference image is any one frame of first image in the N frames of first images and the images to be registered are the first images other than the reference image in the N frames of first images; and
synthesizing the N frames of registered first images to obtain a target image.
Optionally, the processor is specifically configured to:
deblur each frame of blurred image in the N frames of blurred images to obtain N frames of intermediate images, where one frame of blurred image corresponds to one frame of intermediate image; and
denoise each frame of intermediate image in the N frames of intermediate images to obtain the N frames of first images, where one frame of intermediate image corresponds to one frame of first image.
Optionally, the processor is specifically configured to:
perform multiple sub-exposures to obtain a frame of blurred image, where the multiple sub-exposures are obtained by splitting a single exposure using multiple binary code sequences, one binary code sequence corresponding to one sub-exposure.
Optionally, the processor is specifically configured to:
determine the point spread function of a frame of blurred image according to its degree of blur;
determine a scaling factor according to the coded aperture, and scale the frame of blurred image with an interpolation algorithm according to the scaling factor to obtain a second image; and
deblur the second image according to the point spread function and a preset deconvolution algorithm, and restore the deblurred second image to the size of the frame of blurred image to obtain the intermediate image corresponding to the frame of blurred image.
Optionally, the processor is specifically configured to:
remove the noise in a frame of intermediate image using a median filter to obtain the first image corresponding to the frame of intermediate image.
Further, the processor is specifically configured to:
select a filter window of size m × m, where m >= 3 and m is an odd number; and
slide the filter window over the frame of intermediate image, and at each window position calculate the median of the gray values of the pixels in the filter window and replace the gray value of the pixel in the filter window with the median.
Further, the processor is specifically configured to:
calculate the mean of the gray values of the M pixels in a filter window, where M = m × m;
judge whether the gray values of the M pixels are all equal to the mean of the gray values of the M pixels;
if so, determine that the gray value of any one of the M pixels is the median of the gray values of the pixels in the filter window;
if not, divide the M pixels into two sets and judge whether the numbers of pixels in the two sets are both less than M/2; if so, then when the number of pixels in one of the two sets is greater than or equal to the number of pixels in the other set, determine that the gray value of the (⌊M/2⌋ + 1 - s)-th pixel of the other set, taken in order from largest to smallest, is the median, where s is the number of pixels in the one set, and when the number of pixels in the one set is less than the number of pixels in the other set, determine that the gray value of the (⌊M/2⌋ + 1 - t)-th pixel of the one set, taken in order from smallest to largest, is the median, where t is the number of pixels in the other set; and if not, continue to divide the pixels in whichever of the two sets contains more pixels into two sets, until the numbers of pixels in the two finally separated sets are both less than M/2.
With this method, the median of the gray values of the pixels in the filter window can be determined quickly.
Optionally, the processor is specifically configured to:
calculate the curvature and gradient of each pixel in the reference image, and determine at least three feature points of the reference image according to the curvature and gradient of each pixel; and
determine, according to the curvatures and gradients of the at least three feature points of the reference image, the feature point in each frame of image to be registered that corresponds to each of the at least three feature points of the reference image, so as to obtain at least three feature points of each frame of image to be registered, where the set consisting of one feature point of the reference image and the corresponding feature points in all of the images to be registered is one group of feature points.
With the method and device provided by the embodiments of the present invention, after multiple frames of blurred images are acquired, the acquired frames are preprocessed, and the preprocessed images are synthesized according to the feature points extracted from them, so that a clear target image is obtained. Compared with the prior art, a clear image can still be obtained by the above method even in a dark night scene.
Brief description of the drawings
In order to describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the accompanying drawings described below show only some embodiments of the present invention, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a method for processing an image according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of a filter window according to an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a device for processing an image according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of another device for processing an image according to an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of another device for processing an image according to an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
An embodiment of the present invention provides a method for processing an image. As shown in Fig. 1, the method comprises the following steps.
101. Focus on the target scene.
The execution body of the embodiment of the present invention may be a terminal device, for example a mobile phone or a tablet computer. The target scene may be the starry sky at night, fireworks at night, a city or the countryside at night, or the like.
Specifically, focusing on the target scene means determining the focus position of the imaging lens of the terminal device when the target scene lies within the field of view of the imaging lens. The focus position of the imaging lens may be determined according to an image sharpness evaluation function, as in the following steps 11) to 14):
11) Determine the search range [0, L].
Specifically, the search range may be a range preset according to the lens parameters of the terminal device.
12) Determine T, i and i': when i + T lies within [0, L], i' = i + T; when i + T < 0, i' = 0; when i + T > L, i' = L.
Here T is the step size by which the imaging lens is moved during focusing and has a preset initial value; i and i' are position parameters, the initial value of i is 0, and the value of i' is calculated from the values of i and T.
13) Calculate the image sharpness value G(i) at i and the image sharpness value G(i') at i'.
14) Compare G(i) and G(i'). If G(i) < G(i'), let i = i', keep T unchanged and return to step 12). If G(i) >= G(i'), let T = -T/2 and judge whether |T| is less than ε; if not, let i = i' and return to step 12); if so, determine that i' is the focus position.
It should be noted that the negative sign in -T/2 does not indicate whether the value is positive or negative; it only indicates the direction of the step. In the embodiment of the present invention, when the step size is positive its direction is from 0 towards L, and when the step size is negative its direction is from L towards 0.
Here ε is a step-size constraint, which may be a preset value.
The image sharpness evaluation function may specifically be a Laplacian gradient edge detection evaluation function, a Sobel edge detection evaluation function, a Prewitt edge detection evaluation function, a Roberts edge detection evaluation function, or the like.
The imaging lens in the terminal device may be an imaging lens that can be moved under the drive of a motor, thereby implementing the zoom function of the imaging lens.
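A minimal sketch of the focus search in steps 11) to 14) is given below, assuming a Laplacian-variance sharpness measure (one of the evaluation functions listed above) and a hypothetical capture_at(position) callback that moves the lens and returns a grayscale frame; the callback name and the initial step size T0 are illustrative assumptions rather than details taken from this disclosure.

```python
import cv2
import numpy as np

def sharpness(gray: np.ndarray) -> float:
    """Sharpness score G(.) of one frame, here the variance of its Laplacian."""
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def search_focus(capture_at, L: float, eps: float, T0: float) -> float:
    """Hill-climbing focus search over [0, L] with step halving (steps 11-14)."""
    i, T = 0.0, T0
    while True:
        i_prime = min(max(i + T, 0.0), L)          # step 12): clamp i + T to [0, L]
        g_i = sharpness(capture_at(i))             # step 13)
        g_ip = sharpness(capture_at(i_prime))
        if g_i < g_ip:                             # step 14): sharper ahead, keep the step
            i = i_prime
        else:                                      # overshot the peak: reverse and halve
            T = -T / 2.0
            if abs(T) < eps:
                return i_prime                     # i' is taken as the focus position
            i = i_prime
```

The halving-and-reversing update mirrors the sign convention explained above: a positive step moves the lens from 0 towards L, and a negative step moves it back towards 0.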
102. Acquire the N frames of blurred images of the target scene.
Specifically, N >= 2 and N is an integer, and one opening and closing of the aperture of the camera in the terminal device yields one frame of blurred image.
It should be noted that after the camera is turned on, the aperture opens and closes periodically so that multiple frames of blurred images can be obtained, and the N frames of blurred images may be N frames among these multiple frames of blurred images.
Optionally, acquiring each frame of blurred image in the N frames of blurred images of the target scene comprises: performing multiple sub-exposures to obtain the frame of blurred image, where the multiple sub-exposures are obtained by splitting a single exposure using multiple binary code sequences, one binary code sequence corresponding to one sub-exposure.
Specifically, if there are several binary code sequences, the single exposure is split into that many sub-exposures, and the length of each binary code sequence is equal to the number of notches on the aperture. It should be noted that the aperture in the embodiment of the present invention is a redesigned aperture on which at least two notches are arranged.
For example, if the aperture has 4 notches and the single exposure needs to be split into 3 sub-exposures, the 3 binary code sequences may be 0011, 0101 and 1010. Each bit position in a binary code sequence corresponds to one notch, and different bit values represent different notch states, a notch being either open or closed. The state of a notch may be defined as open when the bit value is 1 (or 0), in which case the notch is closed when the bit value is 0 (or 1).
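As a rough illustration of this coded-exposure scheme, the sketch below accumulates the sub-exposures of one frame, with each binary code sequence deciding which notches are open during its sub-exposure; the assumption that the collected light is proportional to the number of open notches is made only for this example and is not stated in the disclosure.

```python
import numpy as np

def coded_exposure(sub_frames, code_sequences):
    """Sum the sub-exposures of one frame, gated by their binary code sequences.

    sub_frames: one array per sub-exposure interval (assumed to be given).
    code_sequences: e.g. ["0011", "0101", "1010"]; sequence k controls the
    notch states during sub-exposure k ('1' = notch open, '0' = notch closed).
    """
    n_notches = len(code_sequences[0])
    frame = np.zeros_like(sub_frames[0], dtype=np.float64)
    for sub, code in zip(sub_frames, code_sequences):
        open_notches = code.count("1")
        frame += sub * (open_notches / n_notches)   # assumed light-collection model
    return frame
```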
103. Preprocess each frame of blurred image in the N frames of blurred images to obtain N frames of first images.
Here one frame of blurred image corresponds to one frame of first image.
Specifically, each frame of blurred image may be preprocessed using a method in the prior art, or using the method provided by the embodiment of the present invention.
Optionally, in a specific implementation, step 103 comprises:
21) Deblur each frame of blurred image in the N frames of blurred images to obtain N frames of intermediate images, where one frame of blurred image corresponds to one frame of intermediate image.
22) Denoise each frame of intermediate image in the N frames of intermediate images to obtain the N frames of first images, where one frame of intermediate image corresponds to one frame of first image.
It should be noted that deblurring and denoising a frame of blurred image are independent processes. However, since the deblurring process may introduce noise, it is preferable to deblur each frame of blurred image first and then denoise it, which improves the quality of the resulting first image.
Optionally, the method for deblurring a frame of blurred image in the N frames of blurred images comprises:
31) determining the point spread function of the frame of blurred image according to its degree of blur;
32) determining a scaling factor according to the coded aperture, and scaling the frame of blurred image with an interpolation algorithm according to the scaling factor to obtain a second image; and
33) deblurring the second image according to the point spread function and a preset deconvolution algorithm, and restoring the deblurred second image to the size of the frame of blurred image to obtain the intermediate image corresponding to the frame of blurred image.
Optionally, before step 31), the method may further comprise: estimating the degree of blur of the frame of blurred image.
Specifically, the degree of blur of the frame of blurred image may be estimated by means of a Radon transform.
Here the scaling factor may be greater than 1, less than 1 or equal to 1; the coded aperture refers to the diameters of the notches on the aperture, and the binary coding described above is the coding of the aperture notches; the preset deconvolution algorithm may be the Richardson-Lucy (RL) algorithm, Wiener filtering restoration, or another such algorithm.
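A minimal sketch of steps 31) to 33) follows. It assumes a horizontal motion-blur point spread function whose length equals the estimated degree of blur, bilinear interpolation for the scaling, and Wiener filtering (one of the algorithms named above) with a constant noise-to-signal ratio; all of these specific choices are assumptions made for illustration.

```python
import cv2
import numpy as np

def motion_psf(length: int, size: int) -> np.ndarray:
    """Assumed horizontal motion-blur PSF whose extent is the estimated blur degree."""
    psf = np.zeros((size, size))
    psf[size // 2, (size - length) // 2:(size + length) // 2] = 1.0
    return psf / psf.sum()

def wiener_deblur(blurred: np.ndarray, psf: np.ndarray, k: float = 0.01) -> np.ndarray:
    """Frequency-domain Wiener filtering restoration with constant noise-to-signal ratio k."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    F_hat = np.conj(H) / (np.abs(H) ** 2 + k) * G
    return np.real(np.fft.ifft2(F_hat))

def deblur_frame(blurred: np.ndarray, blur_extent: int, scale: float) -> np.ndarray:
    h, w = blurred.shape
    psf = motion_psf(blur_extent, size=2 * blur_extent + 1)       # step 31)
    second = cv2.resize(blurred.astype(np.float64), None,
                        fx=scale, fy=scale,
                        interpolation=cv2.INTER_LINEAR)           # step 32): second image
    restored = wiener_deblur(second, psf)                         # step 33): deblur
    return cv2.resize(restored, (w, h),
                      interpolation=cv2.INTER_LINEAR)             # restore the original size
```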
Optionally, denoising a frame of intermediate image in the N frames of intermediate images comprises: removing the noise in the frame of intermediate image by means of a median filtering method, to obtain the first image corresponding to the frame of intermediate image.
Specifically, the noise in an intermediate image is mainly ISO noise, and also includes Gaussian noise, impulse noise and the like.
Optionally, the above method of removing the noise in the frame of intermediate image by median filtering comprises:
41) selecting a filter window of size m × m, where m >= 3 and m is an odd number; and
42) sliding the filter window over the frame of intermediate image, and at each window position calculating the median of the gray values of the pixels in the filter window and replacing the gray value of the pixel in the filter window with the median.
Specifically, the pixels in the filter window refer to all of the pixels in the filter window.
The size of the filter window is generally 3 × 3, 5 × 5 or 7 × 7, and may be chosen according to the size of the image.
Specifically, the median filtering method may be a fast median filtering method accelerated by means of the mean value, which increases the speed of image filtering.
Optionally, calculating the median of the gray values of the pixels in the filter window comprises:
51) calculating the mean of the gray values of the M pixels in the filter window, where M = m × m;
52) judging whether the gray values of the M pixels are all equal to the mean of the gray values of the M pixels;
if so, determining that the gray value of any one of the M pixels is the median of the gray values of the pixels in the filter window;
if not, dividing the M pixels into two sets and judging whether the numbers of pixels in the two sets are both less than M/2; if so, then when the number of pixels in one of the two sets is greater than or equal to the number of pixels in the other set, determining that the gray value of the (⌊M/2⌋ + 1 - s)-th pixel of the other set, taken in order from largest to smallest, is the median, where s is the number of pixels in the one set, and when the number of pixels in the one set is less than the number of pixels in the other set, determining that the gray value of the (⌊M/2⌋ + 1 - t)-th pixel of the one set, taken in order from smallest to largest, is the median, where t is the number of pixels in the other set; and if not, continuing to divide the pixels in whichever of the two sets contains more pixels into two sets, until the numbers of pixels in the two finally separated sets are both less than M/2, in which case the median of the gray values of the pixels in the filter window is determined according to the case in which the numbers of pixels in the two sets are both less than M/2.
It should be noted that ⌊·⌋ denotes rounding down to the nearest integer; for example, ⌊4.5⌋ = 4.
Specifically, when a pixel at a given position in the order from largest to smallest (or from smallest to largest) of a set is to be determined, the gray values of the pixels in the set may first be sorted from largest to smallest (or from smallest to largest) by quick sort, and the pixel is then determined from the sorting result.
With this optional method, the median of the gray values of the pixels in the filter window can be determined quickly.
For example, take a filter window of size 3 × 3, so that the number of pixels in the filter window is 9. As shown in Fig. 2, the digits 0 to 8 in the filter window of the image (the image shown in Fig. 2 has size 6 × 8) denote the 9 pixels, and the number in brackets below each pixel's digit denotes the gray value of that pixel. The median of the gray values of the pixels in this filter window is then determined as follows.
Calculate the mean of the gray values of the 9 pixels, namely (5 + 175 + 182 + 170 + 188 + 193 + 186 + 191 + 195) / 9 = 165. Sorting the gray values of the 9 pixels from smallest to largest gives 5-170-175-182-186-188-191-193-195. According to the sorting result, the 9 pixels are divided into a set A and a set B: the gray values of the pixels in set A are greater than or equal to 165 and the gray values of the pixels in set B are less than 165, so set A contains 8 pixels, namely pixel 1 to pixel 8, and set B contains 1 pixel, namely pixel 0. Since the number of pixels contained in set A is greater than 4.5 (namely 9/2), the mean of the gray values of the 8 pixels in set A is calculated, namely
(175 + 182 + 170 + 188 + 193 + 186 + 191 + 195) / 8 = 185. The 8 pixels in set A are divided into a set A1 and a set A2: the gray values of the pixels in A1 are greater than or equal to 185 and the gray values of the pixels in A2 are less than 185, so A1 contains 5 pixels, namely pixel 4 to pixel 8, and A2 contains 3 pixels, namely pixel 1 to pixel 3. Since the number of pixels contained in A1 is greater than 4.5, the mean of the gray values of the 5 pixels in A1 is calculated, namely (188 + 193 + 186 + 191 + 195) / 5 = 190.6. The 5 pixels in A1 are divided into a set A11 and a set A12: the gray values of the pixels in A11 are greater than or equal to 190.6 and the gray values of the pixels in A12 are less than 190.6, so A11 contains 3 pixels, namely pixel 5, pixel 7 and pixel 8, and A12 contains 2 pixels, namely pixel 4 and pixel 6. The numbers of pixels contained in A11 and A12 are both less than 4.5, and A11 contains more pixels than A12, so the gray value of the (⌊9/2⌋ + 1 - 3)-th, i.e. the 2nd, pixel of A12, taken in order from largest to smallest, is the median of the gray values of the pixels in the filter window; that is, the gray value of pixel 6 (186) is the median.
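The sketch below illustrates the mean-based splitting idea behind steps 41), 42), 51) and 52), together with a simple sliding-window filter built on it. It is not a line-by-line transcription of the formulation above: instead of the final index formula, it keeps a running count of how many values have been eliminated below the split, which is an implementation choice made for this illustration; skipping the border pixels in the sliding window is likewise an assumption.

```python
import math
import numpy as np

def fast_median(values):
    """Mean-based fast median of the gray values in one filter window (a sketch)."""
    M = len(values)
    half = M / 2.0
    current = list(values)
    removed_below = 0                     # values discarded that are smaller than everything kept
    while True:
        mean = sum(current) / len(current)
        if all(v == mean for v in current):
            return current[0]
        upper = [v for v in current if v >= mean]
        lower = [v for v in current if v < mean]
        if len(upper) < half and len(lower) < half:
            k = math.floor(M / 2) + 1 - removed_below   # rank of the median among the kept values
            return sorted(lower + upper)[k - 1]
        if len(upper) >= len(lower):      # keep splitting the larger of the two sets
            removed_below += len(lower)
            current = upper
        else:
            current = lower

def median_filter(image: np.ndarray, m: int = 3) -> np.ndarray:
    """Steps 41)-42): slide an m x m window and replace the centre pixel by the window median."""
    r = m // 2
    out = image.copy()
    for y in range(r, image.shape[0] - r):
        for x in range(r, image.shape[1] - r):
            window = image[y - r:y + r + 1, x - r:x + r + 1].ravel()
            out[y, x] = fast_median(window.tolist())
    return out
```

Applied to the 3 × 3 window of Fig. 2, fast_median([5, 175, 182, 170, 188, 193, 186, 191, 195]) returns 186, which matches the worked example above.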
104. Extract the at least three groups of feature points of the N frames of first images.
Here each group of feature points comprises N feature points, and each feature point in a group corresponds to one frame of first image.
Optionally, in a specific implementation, step 104 may comprise:
61) Calculate the curvature and gradient of each pixel in the reference image, and determine at least three feature points of the reference image according to the curvature and gradient of each pixel.
62) Determine, according to the curvatures and gradients of the at least three feature points of the reference image, the feature point in each frame of image to be registered that corresponds to each of the at least three feature points of the reference image, so as to obtain at least three feature points of each frame of image to be registered; the set consisting of one feature point of the reference image and the corresponding feature points in all of the images to be registered is one group of feature points.
For example, when there are 3 frames of first images, the 4 groups of feature points obtained may be as shown in Table 1, where X11, X12, X13 and X14 are the 4 feature points of the 1st frame of first image, X21, X22, X23 and X24 are the 4 feature points of the 2nd frame of first image, and X31, X32, X33 and X34 are the 4 feature points of the 3rd frame of first image.
Table 1

First image | 1st group of feature points | 2nd group of feature points | 3rd group of feature points | 4th group of feature points
1st frame | X11 | X12 | X13 | X14
2nd frame | X21 | X22 | X23 | X24
3rd frame | X31 | X32 | X33 | X34
Specifically, after the curvature and gradient of each pixel in the reference image have been calculated, a pixel whose curvature is greater than a first preset threshold and whose gradient is greater than a second preset threshold may be determined to be a feature point. The values of the first preset threshold and the second preset threshold may be determined according to the actual application scenario, and are not limited by the embodiment of the present invention.
Correspondingly, a pixel in an image to be registered whose curvature value and gradient value are the same as those of a feature point in the reference image may be determined to be the corresponding feature point in the image to be registered.
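A minimal sketch of this threshold-based selection for step 61) is given below; taking the gradient magnitude from Sobel derivatives and the curvature from the magnitude of the Laplacian are assumptions made for illustration, as the disclosure does not fix the exact operators.

```python
import cv2
import numpy as np

def feature_points(gray: np.ndarray, curv_thresh: float, grad_thresh: float) -> np.ndarray:
    """Return the (row, col) coordinates of pixels whose curvature and gradient
    both exceed the preset thresholds."""
    g = gray.astype(np.float64)
    gx = cv2.Sobel(g, cv2.CV_64F, 1, 0)
    gy = cv2.Sobel(g, cv2.CV_64F, 0, 1)
    gradient = np.hypot(gx, gy)                         # gradient magnitude
    curvature = np.abs(cv2.Laplacian(g, cv2.CV_64F))    # assumed curvature measure
    mask = (curvature > curv_thresh) & (gradient > grad_thresh)
    return np.argwhere(mask)
```

Step 62) can then, for example, match each feature point of the reference image to the candidate in an image to be registered whose curvature and gradient values are the same (or closest), in line with the description above.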
105. With the reference image as a benchmark, register each frame of image to be registered according to the feature points of that frame in the at least three groups of feature points and the feature points of the reference image in the at least three groups of feature points.
Here the reference image is any one frame of first image in the N frames of first images, and the images to be registered are the first images other than the reference image in the N frames of first images.
Based on the example in Table 1, if the 2nd frame of first image is the reference image and a frame of image to be registered is the 1st frame of first image, the 1st frame of first image is registered according to X11, X12, X13, X14 and X21, X22, X23, X24.
Optionally, in a specific implementation, step 105 may comprise:
71) With the reference image as a benchmark, estimate the spatial transformation parameters corresponding to each frame of image to be registered by means of a robust iterative estimation method, according to the feature points of that frame in the at least three groups of feature points and the feature points of the reference image in the at least three groups of feature points.
72) Transform each frame of image to be registered according to its corresponding spatial transformation parameters.
Specifically, the spatial transformation parameters may be an angle parameter, a distance parameter and the like.
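A minimal sketch of steps 71) and 72), assuming the spatial transformation is modelled as a 2-D affine transform estimated from the matched feature points; cv2.estimateAffine2D with RANSAC stands in here for the robust iterative estimation, which is an illustrative choice rather than the one specified by the disclosure.

```python
import cv2
import numpy as np

def register_to_reference(image: np.ndarray,
                          image_pts: np.ndarray,
                          reference_pts: np.ndarray) -> np.ndarray:
    """Warp an image to be registered onto the reference image.

    image_pts and reference_pts have shape (K, 2) with K >= 3, listing the
    feature points of this image and the corresponding feature points of the
    reference image (one pair per group of feature points).
    """
    src = np.asarray(image_pts, dtype=np.float64)
    dst = np.asarray(reference_pts, dtype=np.float64)
    # step 71): robust estimation of the spatial transformation parameters
    A, _inliers = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC)
    h, w = image.shape[:2]
    # step 72): transform the image to be registered
    return cv2.warpAffine(image, A, (w, h))
```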
106. Synthesize the N frames of registered first images to obtain the target image.
Specifically, image fusion technology may be used to synthesize the N frames of first images into the target image.
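As one simple instance of such fusion, the sketch below averages the registered frames; averaging is an assumed fusion rule used here only to show how combining several registered frames suppresses noise, and other fusion strategies could equally be used.

```python
import numpy as np

def fuse_frames(registered_frames) -> np.ndarray:
    """Average-based fusion of the N registered first images (an assumed fusion rule)."""
    stack = np.stack([f.astype(np.float64) for f in registered_frames], axis=0)
    fused = stack.mean(axis=0)
    return np.clip(fused, 0, 255).astype(np.uint8)
```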
In a specific implementation of the above method, after the camera in the terminal device is turned on, the terminal device can use the above method to obtain a clear image in real time from the multiple frames of blurred images it acquires, and present the image to the user.
With the method provided by the embodiment of the present invention, after multiple frames of blurred images are acquired, the acquired frames are preprocessed, and the preprocessed images are synthesized according to the feature points extracted from them, so that a clear target image is obtained. Compared with the prior art, a clear image can still be obtained by the above method even in a dark night scene.
An embodiment of the present invention also provides a device 30 for processing an image, configured to perform the above method. As shown in Fig. 3, the device comprises:
an acquiring unit 301, configured to acquire N frames of blurred images of a target scene, where N >= 2 and N is an integer;
a preprocessing unit 302, configured to preprocess each frame of blurred image in the N frames of blurred images to obtain N frames of first images, where one frame of blurred image corresponds to one frame of first image;
an extraction unit 303, configured to extract at least three groups of feature points from the N frames of first images, where each group of feature points comprises N feature points and each feature point in a group corresponds to one frame of first image;
a registration unit 304, configured to register, with a reference image as a benchmark, each frame of image to be registered according to the feature points of that frame in the at least three groups of feature points and the feature points of the reference image in the at least three groups of feature points, where the reference image is any one frame of first image in the N frames of first images and the images to be registered are the first images other than the reference image in the N frames of first images; and
a synthesis unit 305, configured to synthesize the N frames of registered first images to obtain a target image.
Optionally, as shown in Fig. 4, the preprocessing unit 302 comprises:
a deblurring unit 3021, configured to deblur each frame of blurred image in the N frames of blurred images to obtain N frames of intermediate images, where one frame of blurred image corresponds to one frame of intermediate image; and
a denoising unit 3022, configured to denoise each frame of intermediate image in the N frames of intermediate images to obtain the N frames of first images, where one frame of intermediate image corresponds to one frame of first image.
Optionally, the acquiring unit 301 is configured to: perform multiple sub-exposures to obtain a frame of blurred image, where the multiple sub-exposures are obtained by splitting a single exposure using multiple binary code sequences, one binary code sequence corresponding to one sub-exposure.
Optionally, the deblurring unit 3021 is configured to:
determine the point spread function of a frame of blurred image according to its degree of blur;
determine a scaling factor according to the coded aperture, and scale the frame of blurred image with an interpolation algorithm according to the scaling factor to obtain a second image; and
deblur the second image according to the point spread function and a preset deconvolution algorithm, and restore the deblurred second image to the size of the frame of blurred image to obtain the intermediate image corresponding to the frame of blurred image.
Optionally, the denoising unit 3022 is configured to:
remove the noise in a frame of intermediate image using a median filter to obtain the first image corresponding to the frame of intermediate image.
Further, the denoising unit 3022 is specifically configured to:
select a filter window of size m × m, where m >= 3 and m is an odd number; and
slide the filter window over the frame of intermediate image, and at each window position calculate the median of the gray values of the pixels in the filter window and replace the gray value of the pixel in the filter window with the median.
Further, the denoising unit 3022 is specifically configured to:
calculate the mean of the gray values of the M pixels in a filter window, where M = m × m;
judge whether the gray values of the M pixels are all equal to the mean of the gray values of the M pixels;
if so, determine that the gray value of any one of the M pixels is the median of the gray values of the pixels in the filter window;
if not, divide the M pixels into two sets and judge whether the numbers of pixels in the two sets are both less than M/2; if so, then when the number of pixels in one of the two sets is greater than or equal to the number of pixels in the other set, determine that the gray value of the (⌊M/2⌋ + 1 - s)-th pixel of the other set, taken in order from largest to smallest, is the median, where s is the number of pixels in the one set, and when the number of pixels in the one set is less than the number of pixels in the other set, determine that the gray value of the (⌊M/2⌋ + 1 - t)-th pixel of the one set, taken in order from smallest to largest, is the median, where t is the number of pixels in the other set; and if not, continue to divide the pixels in whichever of the two sets contains more pixels into two sets, until the numbers of pixels in the two finally separated sets are both less than M/2.
With this method, the median of the gray values of the pixels in the filter window can be determined quickly.
Optionally, the extraction unit 303 is specifically configured to:
calculate the curvature and gradient of each pixel in the reference image, and determine at least three feature points of the reference image according to the curvature and gradient of each pixel; and
determine, according to the curvatures and gradients of the at least three feature points of the reference image, the feature point in each frame of image to be registered that corresponds to each of the at least three feature points of the reference image, so as to obtain at least three feature points of each frame of image to be registered, where the set consisting of one feature point of the reference image and the corresponding feature points in all of the images to be registered is one group of feature points.
Optionally, the registration unit 304 is specifically configured to:
with the reference image as a benchmark, estimate the spatial transformation parameters corresponding to each frame of image to be registered by means of a robust iterative estimation method, according to the feature points of that frame in the at least three groups of feature points and the feature points of the reference image in the at least three groups of feature points; and
transform each frame of image to be registered according to its corresponding spatial transformation parameters.
With the device provided by the embodiment of the present invention, after multiple frames of blurred images are acquired, the acquired frames are preprocessed, and the preprocessed images are synthesized according to the feature points extracted from them, so that a clear target image is obtained. Compared with the prior art, a clear image can still be obtained by the above method even in a dark night scene.
In a hardware implementation, the units in the device 30 for processing an image may be embedded in, or independent of, a processor of the device 30 in hardware form, or may be stored in a memory of the device 30 in software form so that the processor can invoke them and perform the operations corresponding to the units above. The processor may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.
An embodiment of the present invention also provides a device 50 for processing an image, configured to perform the above method. As shown in Fig. 5, the device 50 comprises a memory 501 and a processor 502, where the memory 501 stores a set of code and the processor 502 performs the following actions according to the code:
acquiring N frames of blurred images of a target scene, where N >= 2 and N is an integer;
preprocessing each frame of blurred image in the N frames of blurred images to obtain N frames of first images, where one frame of blurred image corresponds to one frame of first image;
extracting at least three groups of feature points from the N frames of first images, where each group of feature points comprises N feature points and each feature point in a group corresponds to one frame of first image;
with a reference image as a benchmark, registering each frame of image to be registered according to the feature points of that frame in the at least three groups of feature points and the feature points of the reference image in the at least three groups of feature points, where the reference image is any one frame of first image in the N frames of first images and the images to be registered are the first images other than the reference image in the N frames of first images; and
synthesizing the N frames of registered first images to obtain a target image.
Optionally, the processor 502 is specifically configured to:
deblur each frame of blurred image in the N frames of blurred images to obtain N frames of intermediate images, where one frame of blurred image corresponds to one frame of intermediate image; and
denoise each frame of intermediate image in the N frames of intermediate images to obtain the N frames of first images, where one frame of intermediate image corresponds to one frame of first image.
Optionally, the processor 502 is specifically configured to: perform multiple sub-exposures to obtain a frame of blurred image, where the multiple sub-exposures are obtained by splitting a single exposure using multiple binary code sequences, one binary code sequence corresponding to one sub-exposure.
Optionally, the processor 502 is specifically configured to:
determine the point spread function of a frame of blurred image according to its degree of blur;
determine a scaling factor according to the coded aperture, and scale the frame of blurred image with an interpolation algorithm according to the scaling factor to obtain a second image; and
deblur the second image according to the point spread function and a preset deconvolution algorithm, and restore the deblurred second image to the size of the frame of blurred image to obtain the intermediate image corresponding to the frame of blurred image.
Optionally, the processor 502 is specifically configured to:
remove the noise in a frame of intermediate image using a median filter to obtain the first image corresponding to the frame of intermediate image.
Further, the processor 502 is specifically configured to:
select a filter window of size m × m, where m >= 3 and m is an odd number; and
slide the filter window over the frame of intermediate image, and at each window position calculate the median of the gray values of the pixels in the filter window and replace the gray value of the pixel in the filter window with the median.
Further, the processor 502 is specifically configured to:
calculate the mean of the gray values of the M pixels in a filter window, where M = m × m;
judge whether the gray values of the M pixels are all equal to the mean of the gray values of the M pixels;
if so, determine that the gray value of any one of the M pixels is the median of the gray values of the pixels in the filter window;
if not, divide the M pixels into two sets and judge whether the numbers of pixels in the two sets are both less than M/2; if so, then when the number of pixels in one of the two sets is greater than or equal to the number of pixels in the other set, determine that the gray value of the (⌊M/2⌋ + 1 - s)-th pixel of the other set, taken in order from largest to smallest, is the median, where s is the number of pixels in the one set, and when the number of pixels in the one set is less than the number of pixels in the other set, determine that the gray value of the (⌊M/2⌋ + 1 - t)-th pixel of the one set, taken in order from smallest to largest, is the median, where t is the number of pixels in the other set; and if not, continue to divide the pixels in whichever of the two sets contains more pixels into two sets, until the numbers of pixels in the two finally separated sets are both less than M/2.
Optionally, the processor 502 is specifically configured to:
calculate the curvature and gradient of each pixel in the reference image, and determine at least three feature points of the reference image according to the curvature and gradient of each pixel; and
determine, according to the curvatures and gradients of the at least three feature points of the reference image, the feature point in each frame of image to be registered that corresponds to each of the at least three feature points of the reference image, so as to obtain at least three feature points of each frame of image to be registered, where the set consisting of one feature point of the reference image and the corresponding feature points in all of the images to be registered is one group of feature points.
Optionally, the processor 502 is specifically configured to:
with the reference image as a benchmark, estimate the spatial transformation parameters corresponding to each frame of image to be registered by means of a robust iterative estimation method, according to the feature points of that frame in the at least three groups of feature points and the feature points of the reference image in the at least three groups of feature points; and
transform each frame of image to be registered according to its corresponding spatial transformation parameters.
With the device provided by the embodiment of the present invention, after multiple frames of blurred images are acquired, the acquired frames are preprocessed, and the preprocessed images are synthesized according to the feature points extracted from them, so that a clear target image is obtained. Compared with the prior art, a clear image can still be obtained by the above method even in a dark night scene.
In the several embodiments provided in the present application, it should be understood that the disclosed device and method may be implemented in other ways. For example, the device embodiments described above are only illustrative; for instance, the division into modules is only a division by logical function, and there may be other ways of dividing them in an actual implementation, for example multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed.
The modules described as separate components may or may not be physically separate, and the components shown as modules may or may not be physical modules; that is, they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional modules in the embodiments of the present invention may be integrated into one processing module, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware, or in the form of hardware plus a software functional module.
The integrated module implemented in the form of a software functional module may be stored in a computer-readable storage medium. The software functional module is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform some of the steps of the methods described in the embodiments of the present invention. The storage medium includes various media capable of storing program code, such as a USB flash drive, a portable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The above embodiments are only intended to describe the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, a person skilled in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (27)

1. A method for processing an image, characterized in that the method comprises:
Acquiring N frames of blurred images of a target scene, where N ≥ 2 and N is an integer;
Preprocessing each frame of blurred image in the N frames of blurred images to obtain N frames of first images, one frame of blurred image corresponding to one frame of first image;
Extracting three groups of feature points of the N frames of first images, each group of feature points comprising N feature points, one feature point in each group corresponding to one frame of first image;
Taking a reference image as the benchmark, registering each frame of image to be registered according to the feature points of that frame of image to be registered in the three groups of feature points and the feature points of the reference image in the three groups of feature points, where the reference image is any frame of first image in the N frames of first images and the images to be registered are the first images other than the reference image in the N frames of first images;
Synthesizing the N frames of first images after registration to obtain a target image.
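Purely as an illustration of the final synthesis step of claim 1 (the claim itself does not fix a fusion rule), a minimal Python sketch that averages the registered frames could look like this; plain averaging is an assumption of the sketch, chosen because it also suppresses noise.

    import numpy as np

    def synthesize(registered_frames):
        """Fuse N registered first images into one target image by averaging
        (one simple, assumed fusion rule)."""
        stack = np.stack([f.astype(np.float64) for f in registered_frames])
        return stack.mean(axis=0)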
2. The method according to claim 1, characterized in that preprocessing each frame of blurred image in the N frames of blurred images to obtain N frames of first images comprises:
Deblurring each frame of blurred image in the N frames of blurred images to obtain N frames of intermediate images, one frame of blurred image corresponding to one frame of intermediate image;
Denoising each frame of intermediate image in the N frames of intermediate images to obtain N frames of first images, one frame of intermediate image corresponding to one frame of first image.
3. The method according to claim 1 or 2, characterized in that acquiring each frame of blurred image in the N frames of blurred images of the target scene comprises:
Performing multiple sub-exposure processes to obtain this frame of blurred image, where the multiple sub-exposure processes are obtained by splitting a single exposure process using multiple binary code sequences, one binary code sequence corresponding to one sub-exposure process.
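To illustrate the coded sub-exposure idea recited in claim 3, the Python sketch below simulates splitting a single exposure into N coded frames: each binary code sequence selects which time slices of the exposure contribute to one blurred frame. The array shapes and the equal-time-slice model are assumptions of the sketch, not requirements of the claim.

    import numpy as np

    def coded_sub_exposures(scene_slices, codes):
        """Simulate splitting one exposure into coded sub-exposures.
        scene_slices: array (T, H, W), the scene over T equal time slices.
        codes: binary array (N, T); one code sequence per output frame,
        1 = shutter open during that slice (illustrative assumption)."""
        scene_slices = np.asarray(scene_slices, dtype=np.float64)
        codes = np.asarray(codes, dtype=np.float64)
        # Each blurred frame integrates only the slices its code selects.
        return np.tensordot(codes, scene_slices, axes=([1], [0]))  # shape (N, H, W)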
4. The method according to claim 2, characterized in that deblurring a frame of blurred image in the N frames of blurred images comprises:
Determining the point spread function of this frame of blurred image according to its degree of blur;
Determining a scaling factor according to the coded aperture, and scaling this frame of blurred image with an interpolation algorithm according to the scaling factor to obtain a second image;
Deblurring the second image according to the point spread function and a preset frequency-domain deconvolution algorithm, and restoring the deblurred second image to the size of this frame of blurred image to obtain the intermediate image corresponding to this frame of blurred image.
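As one possible reading of claim 4's scale–deconvolve–restore chain, the Python sketch below scales the blurred frame, applies Wiener filtering as a stand-in for the preset frequency-domain deconvolution, and restores the original size. The Wiener formulation, the nearest-neighbour resampling and the snr constant are assumptions of the sketch.

    import numpy as np

    def wiener_deblur(blurred, psf, snr=0.01, scale=2):
        """Scale, deconvolve in the frequency domain, restore the size
        (illustrative sketch under stated assumptions)."""
        h, w = blurred.shape
        # 1) Scale the blurred frame by the factor derived from the coded aperture.
        small = blurred[::scale, ::scale].astype(np.float64)
        # 2) Wiener deconvolution: X = conj(H) / (|H|^2 + snr) * Y.
        H = np.fft.fft2(psf, s=small.shape)
        Y = np.fft.fft2(small)
        X = np.conj(H) / (np.abs(H) ** 2 + snr) * Y
        deblurred = np.real(np.fft.ifft2(X))
        # 3) Restore the original size (nearest-neighbour upsampling, then crop).
        restored = np.repeat(np.repeat(deblurred, scale, axis=0), scale, axis=1)
        return restored[:h, :w]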
5. The method according to any one of claims 2-4, characterized in that denoising a frame of intermediate image in the N frames of intermediate images comprises:
Removing the noise in this frame of intermediate image by median filtering to obtain the first image corresponding to this frame of intermediate image.
6. The method according to claim 5, characterized in that removing the noise in this frame of intermediate image by median filtering comprises:
Selecting a filter window of size m × m, where m ≥ 3 and m is an odd number;
Sliding the filter window over this frame of intermediate image, and, each time the filter window slides, calculating the median of the gray-scale values of the pixels in the filter window and replacing the gray-scale value of the pixel in the filter window with the median.
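A direct, sort-based version of the sliding-window median filtering of claim 6 can be sketched in Python as follows; the edge padding and the helper name are choices of the sketch rather than of the claim, and claim 7's faster mean-guided median selection could replace the np.median call.

    import numpy as np

    def median_filter(img, m=3):
        """Baseline m x m median filter with edge padding (illustrative)."""
        assert m >= 3 and m % 2 == 1, "m must be an odd number >= 3"
        pad = m // 2
        padded = np.pad(img.astype(np.float64), pad, mode="edge")
        h, w = img.shape
        out = np.empty((h, w), dtype=np.float64)
        for y in range(h):
            for x in range(w):
                window = padded[y:y + m, x:x + m]   # the M = m*m gray-scale values
                out[y, x] = np.median(window)       # replace with their median
        return out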
7. The method according to claim 6, characterized in that calculating the median of the gray-scale values of the pixels in the filter window comprises:
Calculating the mean of the gray-scale values of the M pixels in the filter window, where M = m × m;
Judging whether the gray-scale values of the M pixels are all equal to the mean of the gray-scale values of the M pixels;
If so, determining the gray-scale value of any one of the M pixels as the median of the gray-scale values of the pixels in the filter window;
If not, dividing the M pixels into two sets and judging whether the numbers of pixels in the two sets are both below the required count. If so, when the number of pixels in one of the two sets is greater than or equal to the number of pixels in the other set, the gray-scale value at the corresponding position, counted from largest to smallest in the other set, is determined to be the median, where s is the number of pixels in the former set; when the number of pixels in the former set is less than the number of pixels in the other set, the gray-scale value at the corresponding position, counted from smallest to largest in the former set, is determined to be the median, where t is the number of pixels in the other set. If not, the set containing more pixels is further divided into two sets, and the division is repeated until the numbers of pixels in the two sets finally obtained are both below the required count.
8. The method according to any one of claims 1-7, characterized in that extracting the three groups of feature points of the N frames of first images comprises:
Calculating the curvature and gradient of each pixel in the reference image, and determining at least three feature points of the reference image according to the curvature and gradient of each pixel;
Determining, according to the curvature and gradient of the at least three feature points of the reference image, the feature point in each frame of image to be registered that corresponds to each of the at least three feature points of the reference image, so as to obtain at least three feature points of each frame of image to be registered, where one feature point of the reference image together with the corresponding feature points in all of the images to be registered forms one group of feature points.
9. The method according to any one of claims 1-8, characterized in that, taking the reference image as the benchmark, registering each frame of image to be registered according to the feature points of that frame of image to be registered in the three groups of feature points and the feature points of the reference image in the three groups of feature points comprises:
Taking the reference image as the benchmark, estimating the spatial transformation parameters corresponding to each frame of image to be registered according to the feature points of that frame of image to be registered in the three groups of feature points, the feature points of the reference image in the three groups of feature points, and a robust iterative method;
Transforming each frame of image to be registered according to its corresponding spatial transformation parameters.
10. A device for processing an image, characterized in that the device comprises:
An acquiring unit, configured to acquire N frames of blurred images of a target scene, where N ≥ 2 and N is an integer;
A preprocessing unit, configured to preprocess each frame of blurred image in the N frames of blurred images to obtain N frames of first images, one frame of blurred image corresponding to one frame of first image;
An extraction unit, configured to extract three groups of feature points of the N frames of first images, each group of feature points comprising N feature points, one feature point in each group corresponding to one frame of first image;
A registration unit, configured to register, taking a reference image as the benchmark, each frame of image to be registered according to the feature points of that frame of image to be registered in the three groups of feature points and the feature points of the reference image in the three groups of feature points, where the reference image is any frame of first image in the N frames of first images and the images to be registered are the first images other than the reference image in the N frames of first images;
A synthesis unit, configured to synthesize the N frames of first images after registration to obtain a target image.
11. The device according to claim 10, characterized in that the preprocessing unit comprises:
A deblurring unit, configured to deblur each frame of blurred image in the N frames of blurred images to obtain N frames of intermediate images, one frame of blurred image corresponding to one frame of intermediate image;
A denoising unit, configured to denoise each frame of intermediate image in the N frames of intermediate images to obtain N frames of first images, one frame of intermediate image corresponding to one frame of first image.
12. The device according to claim 10 or 11, characterized in that the acquiring unit is configured to: perform multiple sub-exposure processes to obtain a frame of blurred image, where the multiple sub-exposure processes are obtained by splitting a single exposure process using multiple binary code sequences, one binary code sequence corresponding to one sub-exposure process.
13. The device according to claim 11, characterized in that the deblurring unit is configured to:
Determine the point spread function of a frame of blurred image according to its degree of blur;
Determine a scaling factor according to the coded aperture, and scale this frame of blurred image with an interpolation algorithm according to the scaling factor to obtain a second image;
Deblur the second image according to the point spread function and a preset frequency-domain deconvolution algorithm, and restore the deblurred second image to the size of this frame of blurred image to obtain the intermediate image corresponding to this frame of blurred image.
14. The device according to any one of claims 11-13, characterized in that the denoising unit is configured to:
Remove the noise in a frame of intermediate image with a median filter to obtain the first image corresponding to this frame of intermediate image.
15. The device according to claim 14, characterized in that the denoising unit is specifically configured to:
Select a filter window of size m × m, where m ≥ 3 and m is an odd number;
Slide the filter window over a frame of intermediate image, and, each time the filter window slides, calculate the median of the gray-scale values of the pixels in the filter window and replace the gray-scale value of the pixel in the filter window with the median.
16. The device according to claim 15, characterized in that the denoising unit is specifically configured to:
Calculate the mean of the gray-scale values of the M pixels in a filter window, where M = m × m;
Judge whether the gray-scale values of the M pixels are all equal to the mean of the gray-scale values of the M pixels;
If so, determine the gray-scale value of any one of the M pixels as the median of the gray-scale values of the pixels in the filter window;
If not, divide the M pixels into two sets and judge whether the numbers of pixels in the two sets are both below the required count. If so, when the number of pixels in one of the two sets is greater than or equal to the number of pixels in the other set, the gray-scale value at the corresponding position, counted from largest to smallest in the other set, is determined to be the median, where s is the number of pixels in the former set; when the number of pixels in the former set is less than the number of pixels in the other set, the gray-scale value at the corresponding position, counted from smallest to largest in the former set, is determined to be the median, where t is the number of pixels in the other set. If not, the set containing more pixels is further divided into two sets, and the division is repeated until the numbers of pixels in the two sets finally obtained are both below the required count.
17. The device according to any one of claims 10-16, characterized in that the extraction unit is specifically configured to:
Calculate the curvature and gradient of each pixel in the reference image, and determine at least three feature points of the reference image according to the curvature and gradient of each pixel;
Determine, according to the curvature and gradient of the at least three feature points of the reference image, the feature point in each frame of image to be registered that corresponds to each of the at least three feature points of the reference image, so as to obtain at least three feature points of each frame of image to be registered, where one feature point of the reference image together with the corresponding feature points in all of the images to be registered forms one group of feature points.
18. The device according to any one of claims 10-17, characterized in that the registration unit is specifically configured to:
Take the reference image as the benchmark and estimate the spatial transformation parameters corresponding to each frame of image to be registered according to the feature points of that frame of image to be registered in the three groups of feature points, the feature points of the reference image in the three groups of feature points, and a robust iterative method;
Transform each frame of image to be registered according to its corresponding spatial transformation parameters.
19. A device for processing an image, characterized in that the device comprises a memory and a processor, where the memory stores a set of code and the processor performs the following actions according to the code:
Acquiring N frames of blurred images of a target scene, where N ≥ 2 and N is an integer;
Preprocessing each frame of blurred image in the N frames of blurred images to obtain N frames of first images, one frame of blurred image corresponding to one frame of first image;
Extracting three groups of feature points of the N frames of first images, each group of feature points comprising N feature points, one feature point in each group corresponding to one frame of first image;
Taking a reference image as the benchmark, registering each frame of image to be registered according to the feature points of that frame of image to be registered in the three groups of feature points and the feature points of the reference image in the three groups of feature points, where the reference image is any frame of first image in the N frames of first images and the images to be registered are the first images other than the reference image in the N frames of first images;
Synthesizing the N frames of first images after registration to obtain a target image.
20. The device according to claim 19, characterized in that the processor is specifically configured to:
Deblur each frame of blurred image in the N frames of blurred images to obtain N frames of intermediate images, one frame of blurred image corresponding to one frame of intermediate image;
Denoise each frame of intermediate image in the N frames of intermediate images to obtain N frames of first images, one frame of intermediate image corresponding to one frame of first image.
21. The device according to claim 19 or 20, characterized in that the processor is specifically configured to: perform multiple sub-exposure processes to obtain a frame of blurred image, where the multiple sub-exposure processes are obtained by splitting a single exposure process using multiple binary code sequences, one binary code sequence corresponding to one sub-exposure process.
22. The device according to claim 20, characterized in that the processor is specifically configured to:
Determine the point spread function of a frame of blurred image according to its degree of blur;
Determine a scaling factor according to the coded aperture, and scale this frame of blurred image with an interpolation algorithm according to the scaling factor to obtain a second image;
Deblur the second image according to the point spread function and a preset frequency-domain deconvolution algorithm, and restore the deblurred second image to the size of this frame of blurred image to obtain the intermediate image corresponding to this frame of blurred image.
23. The device according to any one of claims 20-22, characterized in that the processor is specifically configured to:
Remove the noise in a frame of intermediate image with a median filter to obtain the first image corresponding to this frame of intermediate image.
24. The device according to claim 23, characterized in that the processor is specifically configured to:
Select a filter window of size m × m, where m ≥ 3 and m is an odd number;
Slide the filter window over a frame of intermediate image, and, each time the filter window slides, calculate the median of the gray-scale values of the pixels in the filter window and replace the gray-scale value of the pixel in the filter window with the median.
25. The device according to claim 24, characterized in that the processor is specifically configured to:
Calculate the mean of the gray-scale values of the M pixels in a filter window, where M = m × m;
Judge whether the gray-scale values of the M pixels are all equal to the mean of the gray-scale values of the M pixels;
If so, determine the gray-scale value of any one of the M pixels as the median of the gray-scale values of the pixels in the filter window;
If not, divide the M pixels into two sets and judge whether the numbers of pixels in the two sets are both below the required count. If so, when the number of pixels in one of the two sets is greater than or equal to the number of pixels in the other set, the gray-scale value at the corresponding position, counted from largest to smallest in the other set, is determined to be the median, where s is the number of pixels in the former set; when the number of pixels in the former set is less than the number of pixels in the other set, the gray-scale value at the corresponding position, counted from smallest to largest in the former set, is determined to be the median, where t is the number of pixels in the other set. If not, the set containing more pixels is further divided into two sets, and the division is repeated until the numbers of pixels in the two sets finally obtained are both below the required count.
26. The device according to any one of claims 19-25, characterized in that the processor is specifically configured to:
Calculate the curvature and gradient of each pixel in the reference image, and determine at least three feature points of the reference image according to the curvature and gradient of each pixel;
Determine, according to the curvature and gradient of the at least three feature points of the reference image, the feature point in each frame of image to be registered that corresponds to each of the at least three feature points of the reference image, so as to obtain at least three feature points of each frame of image to be registered, where one feature point of the reference image together with the corresponding feature points in all of the images to be registered forms one group of feature points.
27. The device according to any one of claims 19-26, characterized in that the processor is specifically configured to:
Take the reference image as the benchmark and estimate the spatial transformation parameters corresponding to each frame of image to be registered according to the feature points of that frame of image to be registered in the three groups of feature points, the feature points of the reference image in the three groups of feature points, and a robust iterative method;
Transform each frame of image to be registered according to its corresponding spatial transformation parameters.
CN201511009631.0A 2015-12-29 2015-12-29 Image processing method and device Pending CN105631828A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201511009631.0A CN105631828A (en) 2015-12-29 2015-12-29 Image processing method and device

Publications (1)

Publication Number Publication Date
CN105631828A true CN105631828A (en) 2016-06-01

Family

ID=56046716

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201511009631.0A Pending CN105631828A (en) 2015-12-29 2015-12-29 Image processing method and device

Country Status (1)

Country Link
CN (1) CN105631828A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103020945A (en) * 2011-09-21 2013-04-03 中国科学院电子学研究所 Remote sensing image registration method of multi-source sensor
CN102867297A (en) * 2012-08-31 2013-01-09 天津大学 Digital processing method for low-illumination image acquisition
US20150304538A1 (en) * 2014-04-22 2015-10-22 Shih-Chieh Huang Device for synthesizing high dynamic range image based on per-pixel exposure mapping and method thereof

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HUA Shungang et al.: "Registration of differently exposed images of the same scene and HDR image synthesis", Journal of Computer-Aided Design & Computer Graphics *
ZHANG Yujin: "Image Engineering (Vol. III): Image Understanding", 31 December 2012 *
GE Cheng et al.: "High dynamic range image registration algorithm based on SURF features", Microcomputer Applications *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018076942A1 (en) * 2016-10-26 2018-05-03 努比亚技术有限公司 Method and apparatus for implementing slow shutter photography
CN107995432A (en) * 2016-10-26 2018-05-04 努比亚技术有限公司 A kind of method and apparatus realized slow door and taken pictures
CN107154028A (en) * 2017-04-19 2017-09-12 珠海市魅族科技有限公司 Image denoising method and device
CN107036629A (en) * 2017-04-20 2017-08-11 武汉大学 The in-orbit relative radiometric calibration method and system of video satellite
CN107170002A (en) * 2017-05-04 2017-09-15 中国科学院微电子研究所 A kind of image Atomatic focusing method and equipment
CN107170002B (en) * 2017-05-04 2020-07-21 中国科学院微电子研究所 Automatic image focusing method and device
CN107992780A (en) * 2017-10-31 2018-05-04 维沃移动通信有限公司 A kind of code recognition method and mobile terminal
CN107992780B (en) * 2017-10-31 2021-05-28 维沃移动通信有限公司 Code identification method and mobile terminal
CN108510560B (en) * 2018-04-11 2020-01-24 腾讯科技(深圳)有限公司 Image processing method, image processing device, storage medium and computer equipment
CN108510560A (en) * 2018-04-11 2018-09-07 腾讯科技(深圳)有限公司 Image processing method, device, storage medium and computer equipment
CN108596222A (en) * 2018-04-11 2018-09-28 西安电子科技大学 Image interfusion method based on deconvolution neural network
CN108596222B (en) * 2018-04-11 2021-05-18 西安电子科技大学 Image fusion method based on deconvolution neural network
CN108830798A (en) * 2018-04-23 2018-11-16 西安电子科技大学 Improved image denoising method based on propagation filter
CN109190571A (en) * 2018-09-12 2019-01-11 内蒙古大学 A kind of detection recognition method and its device of grazing sheep feeding typical plant type
CN109934142A (en) * 2019-03-04 2019-06-25 北京字节跳动网络技术有限公司 Method and apparatus for generating the feature vector of video
CN109919220A (en) * 2019-03-04 2019-06-21 北京字节跳动网络技术有限公司 Method and apparatus for generating the feature vector of video
CN109934142B (en) * 2019-03-04 2021-07-06 北京字节跳动网络技术有限公司 Method and apparatus for generating feature vectors of video
CN110415191A (en) * 2019-07-31 2019-11-05 西安第六镜网络科技有限公司 A kind of image deblurring algorithm based on successive video frames
CN113160103A (en) * 2021-04-22 2021-07-23 展讯通信(上海)有限公司 Image processing method and device, storage medium and terminal

Similar Documents

Publication Publication Date Title
CN105631828A (en) Image processing method and device
CN106934397B (en) Image processing method and device and electronic equipment
Levin Blind motion deblurring using image statistics
KR102056073B1 (en) Image deblurring network processing methods and systems
Tiwari et al. Review of motion blur estimation techniques
US20160048952A1 (en) Algorithm and device for image processing
EP2947627B1 (en) Light field image depth estimation
CN111091091A (en) Method, device and equipment for extracting target object re-identification features and storage medium
US9530079B2 (en) Point spread function classification using structural properties
US20140140626A1 (en) Edge Direction and Curve Based Image De-Blurring
EP3371741B1 (en) Focus detection
CN111325667A (en) Image processing method and related product
CN108234826B (en) Image processing method and device
Nah et al. Clean images are hard to reblur: Exploiting the ill-posed inverse task for dynamic scene deblurring
US20180089809A1 (en) Image deblurring with a multiple section, regularization term
CN108810319B (en) Image processing apparatus, image processing method, and program
CN112801890B (en) Video processing method, device and equipment
CN107833232B (en) Image detail extraction method and device, electronic equipment and computer storage medium
KR101527962B1 (en) method of detecting foreground in video
Queiroz et al. Image deblurring using maps of highlights
EP3101592A1 (en) Method and apparatus for scoring an image according to aesthetics
CN112672052A (en) Image data enhancement method and system, electronic equipment and storage medium
Dhanakshirur et al. Evidence based feature selection and collaborative representation towards learning based PSF estimation for motion deblurring
JP4177128B2 (en) Radar image processing device
CN111344736A (en) Image processing method, image processing device and unmanned aerial vehicle

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20160601