CN105654470B - Image selection method, apparatus and system - Google Patents

Image selection method, apparatus and system

Info

Publication number
CN105654470B
CN105654470B CN201510988384.7A CN201510988384A
Authority
CN
China
Prior art keywords
image
value
weight map
to-be-selected image
formula
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510988384.7A
Other languages
Chinese (zh)
Other versions
CN105654470A (en)
Inventor
侯文迪
王百超
曲雯雯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiaomi Inc
Original Assignee
Xiaomi Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiaomi Inc filed Critical Xiaomi Inc
Priority to CN201510988384.7A priority Critical patent/CN105654470B/en
Publication of CN105654470A publication Critical patent/CN105654470A/en
Application granted Critical
Publication of CN105654470B publication Critical patent/CN105654470B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20024 Filtering details
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30168 Image quality inspection

Abstract

The present disclosure relates to an image selection method, apparatus and system. The method includes: calculating the luminance mean of each image among the images to be selected, and calculating a brightness evaluation value of each image from the luminance mean; calculating a first weight map mean of each image to be selected in the time domain, calculating an energy occupation ratio of each image to be selected in the frequency domain, and calculating a clarity weight map of each image to be selected from the first weight map mean and the energy occupation ratio; determining a feature fusion value of each image from the brightness evaluation value and the clarity weight map, and taking the image with the largest feature fusion value among the images to be selected as the finally selected image. By combining the luminance information and the sharpness information of the images, the disclosure selects the image of best quality among images of the same scene.

Description

Image selection method, apparatus and system
Technical field
The present disclosure relates to image processing technology, and more particularly to an image selection method, apparatus and system.
Background art
In daily life, taking photos has become a habit for many people. However, multiple photos may be taken of the same scene, and because shooting conditions differ, the quality of these photos varies. One photo of best quality then needs to be chosen from the several photos for display, in particular from among the similar photos taken of the same scene.
Summary of the invention
To overcome problems in the related art, the present disclosure provides an image selection method, apparatus and system.
According to a first aspect of the embodiments of the present disclosure, an image selection method is provided, including:
calculating the luminance mean of each image among the images to be selected, and calculating a brightness evaluation value of each image from the luminance mean;
calculating a first weight map mean of each image to be selected in the time domain, calculating an energy occupation ratio of each image to be selected in the frequency domain, and calculating a clarity weight map of each image to be selected from the first weight map mean and the energy occupation ratio;
determining a feature fusion value of each image from the brightness evaluation value and the clarity weight map, and taking the image with the largest feature fusion value among the images to be selected as the finally selected image.
With reference to the first aspect, in a first possible implementation of the first aspect, calculating the luminance mean of each image among the images to be selected and calculating the brightness evaluation value of each image from the luminance mean includes:
transforming each image f(x, y) to be selected into log space using the formula f'(x, y) = log(f(x, y) + 1);
after taking the mean of each image in log space, transforming each log-space image f'(x, y) back to the original space using the formula v = exp(mean(f'(x, y))) - 1, and taking the resulting value as the luminance mean v;
calculating the brightness evaluation value L1 of each image from the luminance mean v using the formula L1 = 1 - abs(v - 0.5);
where exp(mean(f'(x, y))) denotes the exponential function with the natural constant e as base and mean(f'(x, y)) as exponent; mean(f'(x, y)) denotes averaging the log-space image f'(x, y) of each image to be selected; and abs(v - 0.5) denotes taking the absolute value of the difference between the luminance mean and the reference point 0.5.
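As a concreteness aid rather than part of the disclosure, the brightness evaluation above can be sketched in a few lines of Python with NumPy. The function name and the assumption that the input is a grayscale array normalized to [0, 1] are mine; the three formulas are the ones stated in the text:

```python
import numpy as np

def brightness_evaluation(image):
    """Brightness evaluation value L1 = 1 - |v - 0.5|, where the
    luminance mean v is computed through log space, per the formulas above."""
    f_log = np.log(image + 1.0)        # f'(x, y) = log(f(x, y) + 1)
    v = np.exp(f_log.mean()) - 1.0     # v = exp(mean(f'(x, y))) - 1
    return 1.0 - abs(v - 0.5)          # L1 = 1 - abs(v - 0.5)

# A mid-gray image scores highest; a very dark one scores lower.
mid = brightness_evaluation(np.full((8, 8), 0.5))
dark = brightness_evaluation(np.full((8, 8), 0.05))
```

For a constant image, exp(mean(log(f + 1))) - 1 returns the original value exactly, so a uniform mid-gray image (v = 0.5) attains the maximum score L1 = 1.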
With reference to the first possible implementation of the first aspect, in a second possible implementation of the first aspect, calculating the first weight map mean of each image to be selected in the time domain includes:
dividing each image to be selected into at least one region block;
calculating the comprehensive difference value w(x) of the pixels in each region block;
where Xi and Xj denote pixel values within a region block of an image to be selected, Ω denotes the image region formed by all the region blocks of an image to be selected in the time domain, and i and j are positive integers greater than or equal to 1 with i ≠ j;
taking the largest comprehensive difference value in each region block as the comprehensive difference value of that region block, the comprehensive difference values of all the region blocks forming the comprehensive difference map of the region blocks;
calculating the first weight map S from the comprehensive difference map of the region blocks;
calculating the first weight map mean S1 using the formula S1 = mean(S);
where the maximum is taken over the comprehensive difference values v(Ω1) of the pixels in the region blocks, and mean(S) denotes averaging the first weight map S.
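The formulas for w(x) and S appear only as images in the original patent and are not reproduced in this text, so the sketch below substitutes plausible stand-ins: w(x) as the sum of absolute differences between a pixel and the other pixels of its block, and S as the per-block maxima normalized by the global maximum. Only the overall structure (block division, per-block maximum, mean of the weight map) follows the text; the inner formulas are assumptions:

```python
import numpy as np

def first_weight_map_mean(image, block=8):
    """Time-domain first weight map mean S1. The block/max/mean structure
    follows the patent text; the difference measure and the normalization
    are assumed stand-ins for the formulas lost in extraction."""
    h, w = image.shape
    block_max = []
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            px = image[r:r + block, c:c + block].ravel()
            # Assumed w(x): sum of |Xi - Xj| over the other pixels j of the block.
            diffs = np.abs(px[:, None] - px[None, :]).sum(axis=1)
            # Per the text: keep the largest comprehensive difference value per block.
            block_max.append(diffs.max())
    diff_map = np.array(block_max)
    # Assumed normalization: divide by the largest comprehensive difference value.
    S = diff_map / diff_map.max() if diff_map.max() > 0 else diff_map
    return S.mean()    # S1 = mean(S)
```

A flat image yields S1 = 0, while a high-contrast image whose blocks all reach the same maximum difference yields S1 = 1, matching the intent that S1 grows with local contrast.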
With reference to the second possible implementation of the first aspect, in a third possible implementation of the first aspect, before taking the largest comprehensive difference value in each region block as the comprehensive difference value of that region block, the method further includes:
calculating the largest comprehensive difference value in each region block from the comprehensive difference values of the pixels in that region block.
With reference to the third possible implementation of the first aspect, in a fourth possible implementation of the first aspect, calculating the energy occupation ratio of each image to be selected in the frequency domain includes:
transforming each image I to be selected into the frequency domain using the formula F = fft(I);
where fft(I) denotes performing a Fourier transform on each image I to be selected;
filtering each image to be selected in the frequency domain;
computing, using the formula F1 = abs(F) > T, the energy values F1 that are greater than a preset threshold T from the filtered energy values F of each image to be selected;
where abs(F) denotes taking the modulus of the filtered energy values F of each image to be selected, and the preset threshold T = 5;
calculating the energy occupation ratio S2 of each image to be selected in the frequency domain from F1.
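A sketch of the frequency-domain step, again hedged: the patent's filter and the exact formula for S2 are not reproduced in this text, so the code below assumes a crude high-pass filter (zeroing a small low-frequency neighborhood around DC) and takes S2 as the fraction of filtered coefficients whose modulus exceeds T. Only F = fft(I) and the threshold T = 5 come from the text:

```python
import numpy as np

def energy_occupation_ratio(image, T=5.0, lowcut=2):
    """Frequency-domain energy occupation ratio S2. The filter and the
    ratio formula are assumptions; F = fft(I) and T = 5 follow the text."""
    F = np.fft.fft2(image)                 # F = fft(I)
    F = np.fft.fftshift(F)                 # move DC to the center
    ch, cw = F.shape[0] // 2, F.shape[1] // 2
    # Assumed filter: suppress a small low-frequency block around DC.
    F[ch - lowcut:ch + lowcut + 1, cw - lowcut:cw + lowcut + 1] = 0
    F1 = np.abs(F) > T                     # F1 = abs(F) > T
    return F1.mean()                       # assumed S2: fraction above threshold
```

A uniform image, whose energy sits entirely at DC, scores 0, while a high-contrast pattern keeps significant high-frequency energy above the threshold, which is the behavior the text attributes to sharp images.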
With reference to the fourth possible implementation of the first aspect, in a fifth possible implementation of the first aspect, calculating the clarity weight map of each image to be selected from the first weight map mean and the energy occupation ratio includes:
calculating the clarity weight map L2 of each image to be selected from the first weight map mean S1 and the energy occupation ratio S2 using the formula L2 = S1*β + S2*(1 - β);
where β is the weighting value of the first weight map mean.
With reference to the fifth possible implementation of the first aspect, in a sixth possible implementation of the first aspect, determining the feature fusion value of each image from the brightness evaluation value and the clarity weight map includes:
calculating the feature fusion value L of each image from the brightness evaluation value L1 and the clarity weight map L2 using the formula L = L1*L2.
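The fusion and selection steps are simple enough to sketch directly. The formulas L2 = S1*β + S2*(1 - β) and L = L1*L2 are the patent's; the helper names, the per-image score tuples, and the choice β = 0.5 (the text does not fix β) are mine:

```python
def feature_fusion_value(L1, S1, S2, beta=0.5):
    """L2 = S1*beta + S2*(1 - beta); L = L1 * L2, per the patent's formulas."""
    L2 = S1 * beta + S2 * (1.0 - beta)
    return L1 * L2

def select_best(scored_images):
    """Pick the image whose feature fusion value L is largest.
    scored_images: list of (name, L1, S1, S2) tuples."""
    return max(scored_images,
               key=lambda t: feature_fusion_value(t[1], t[2], t[3]))[0]

# Hypothetical scores for three similar shots of the same scene.
images = [("a.jpg", 0.9, 0.4, 0.3),
          ("b.jpg", 0.7, 0.9, 0.8),
          ("c.jpg", 0.95, 0.2, 0.1)]
best = select_best(images)   # "b.jpg": sharper wins despite a lower brightness score
```

Because L multiplies the two criteria, an image must score reasonably on both brightness and clarity to win, which matches the selection behavior the claims describe.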
With reference to the first aspect or any of the first to sixth possible implementations of the first aspect, in a seventh possible implementation of the first aspect, before calculating the luminance mean of each image among the images to be selected, the method further includes:
obtaining at least two images to be selected.
According to a second aspect of the embodiments of the present disclosure, an image selection apparatus is provided, including:
a first computing module configured to calculate the luminance mean of each image among the images to be selected, and determine the brightness evaluation value of each image from the luminance mean;
a second computing module configured to calculate the first weight map mean of each image to be selected in the time domain, calculate the energy occupation ratio of each image to be selected in the frequency domain, and calculate the clarity weight map of each image to be selected from the first weight map mean and the energy occupation ratio;
a selection module configured to determine the feature fusion value of each image from the brightness evaluation value calculated by the first computing module and the clarity weight map calculated by the second computing module, and take the image with the largest feature fusion value among the images to be selected as the finally selected image.
With reference to the second aspect, in a first possible implementation of the second aspect, the first computing module includes:
a space transform submodule configured to transform each image f(x, y) to be selected into log space using the formula f'(x, y) = log(f(x, y) + 1);
a first determining submodule configured to, after taking the mean of each log-space image produced by the space transform submodule, transform each log-space image f'(x, y) back to the original space using the formula v = exp(mean(f'(x, y))) - 1, and take the resulting value as the luminance mean v;
a first computing submodule configured to calculate the brightness evaluation value L1 of each image from the luminance mean v determined by the first determining submodule, using the formula L1 = 1 - abs(v - 0.5);
where exp(mean(f'(x, y))) denotes the exponential function with the natural constant e as base and mean(f'(x, y)) as exponent; mean(f'(x, y)) denotes averaging the log-space image f'(x, y) of each image to be selected; and abs(v - 0.5) denotes taking the absolute value of the difference between the luminance mean and the reference point 0.5.
With reference to the first possible implementation of the second aspect, in a second possible implementation of the second aspect, the second computing module includes:
a blocking submodule configured to divide each image to be selected into at least one region block;
a block computing submodule configured to calculate the comprehensive difference value w(x) of the pixels in each region block divided by the blocking submodule;
where Xi and Xj denote pixel values within a region block of an image to be selected, Ω denotes the image region formed by all the region blocks of an image to be selected in the time domain, and i and j are positive integers greater than or equal to 1 with i ≠ j;
a second determining submodule configured to take the largest comprehensive difference value in each region block as the comprehensive difference value of that region block, the comprehensive difference values of all the region blocks forming the comprehensive difference map of the region blocks;
a weight map computing submodule configured to calculate the first weight map S from the comprehensive difference map of the region blocks determined by the second determining submodule;
a mean computing submodule configured to calculate the first weight map mean S1 from the first weight map S calculated by the weight map computing submodule, using the formula S1 = mean(S);
where the maximum is taken over the comprehensive difference values v(Ω1) of the pixels in the region blocks, and mean(S) denotes averaging the first weight map S.
With reference to the second possible implementation of the second aspect, in a third possible implementation of the second aspect, the second computing module further includes a maximum value computing submodule;
the maximum value computing submodule is configured to calculate the largest comprehensive difference value in each region block from the comprehensive difference values of the pixels in that region block, before the second determining submodule takes the largest comprehensive difference value in each region block as the comprehensive difference value of that region block.
With reference to the third possible implementation of the second aspect, in a fourth possible implementation of the second aspect, the second computing module further includes:
a frequency-domain transform submodule configured to transform each image I to be selected into the frequency domain using the formula F = fft(I);
where fft(I) denotes performing a Fourier transform on each image I to be selected;
a filtering submodule configured to filter each image to be selected in the frequency domain produced by the frequency-domain transform submodule;
an energy value computing submodule configured to compute, using the formula F1 = abs(F) > T, the energy values F1 that are greater than a preset threshold T from the energy values F of each image to be selected after filtering by the filtering submodule;
where abs(F) denotes taking the modulus of the filtered energy values F of each image to be selected, and the preset threshold T = 5;
an occupation ratio computing submodule configured to calculate the energy occupation ratio S2 of each image to be selected in the frequency domain from the energy values F1 computed by the energy value computing submodule.
With reference to the fourth possible implementation of the second aspect, in a fifth possible implementation of the second aspect, the second computing module further includes:
a second computing submodule configured to calculate the clarity weight map L2 of each image to be selected from the first weight map mean S1 calculated by the mean computing submodule and the energy occupation ratio S2 calculated by the occupation ratio computing submodule, using the formula L2 = S1*β + S2*(1 - β);
where β is the weighting value of the first weight map mean.
With reference to the fifth possible implementation of the second aspect, in a sixth possible implementation of the second aspect, the selection module includes:
a selection computing submodule configured to calculate the feature fusion value L of each image from the brightness evaluation value L1 calculated by the first computing submodule and the clarity weight map L2 calculated by the second computing submodule, using the formula L = L1*L2.
With reference to the second aspect or any of the first to sixth possible implementations of the second aspect, in a seventh possible implementation of the second aspect, the apparatus further includes:
an obtaining module configured to obtain at least two images to be selected.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects:
In one embodiment, the luminance mean of each image among the images to be selected is calculated and the brightness evaluation value of each image is calculated from it; the first weight map mean of each image to be selected in the time domain and the energy occupation ratio of each image to be selected in the frequency domain are calculated, and the clarity weight map of each image to be selected is calculated from them; the feature fusion value of each image is determined from the brightness evaluation value and the clarity weight map, and the image with the largest feature fusion value is taken as the finally selected image. The images to be selected are thus ranked by quality using both their luminance information and their sharpness information, and the image of best quality is selected: the best image among images of the same scene, in particular among similar images.
In another embodiment, each image f(x, y) to be selected is transformed into log space using the formula f'(x, y) = log(f(x, y) + 1); after the mean of each image is taken in log space, each log-space image f'(x, y) is transformed back to the original space using the formula v = exp(mean(f'(x, y))) - 1 and the resulting value is taken as the luminance mean v; and the brightness evaluation value L1 is calculated from v using the formula L1 = 1 - abs(v - 0.5). This better matches the visual characteristics of the human eye, so the luminance information obtained for each image is more accurate.
In another embodiment, each image to be selected is divided into at least one region block; the comprehensive difference value w(x) of the pixels in each region block is calculated; the largest comprehensive difference value in each region block is taken as the comprehensive difference value of that region block, the comprehensive difference values of all the region blocks forming the comprehensive difference map of the region blocks; the first weight map S is calculated from the comprehensive difference map of the region blocks; and the first weight map mean S1 is calculated using the formula S1 = mean(S). This realizes the calculation of the first weight map of each image to be selected in the time domain and ensures its accuracy, thereby improving the accuracy of the clarity weight map of each image.
In another embodiment, before the largest comprehensive difference value in each region block is taken as the comprehensive difference value of that region block, the largest comprehensive difference value in each region block is calculated from the comprehensive difference values of the pixels in that region block. This realizes the calculation of the largest comprehensive difference value in each region block and thus ensures the accuracy of the first weight map.
In another embodiment, each image to be selected is transformed into the frequency domain using the formula F = fft(I) and filtered there; the energy values F1 that are greater than the preset threshold are computed from the filtered energy values F using the formula F1 = abs(F) > T; and the energy occupation ratio S2 of each image to be selected in the frequency domain is calculated from F1. This realizes the calculation of the energy occupation ratio in the frequency domain and ensures its accuracy, thereby improving the accuracy of the clarity weight map of each image.
In another embodiment, the clarity weight map L2 of each image to be selected is calculated from the first weight map mean and the energy occupation ratio using the formula L2 = S1*β + S2*(1 - β), where β is the weighting value of the first weight map mean. This realizes the calculation of the clarity weight map of each image and ensures its accuracy, thereby improving the accuracy of the feature fusion value of each image.
In another embodiment, the feature fusion value L of each image is calculated from the brightness evaluation value and the clarity weight map using the formula L = L1*L2, which ensures the accuracy of the feature fusion value and therefore that the image of best quality is selected: the best image among images of the same scene, in particular among similar images.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 is a flowchart of an image selection method according to an exemplary embodiment;
Fig. 2 is a flowchart of an image selection method according to another exemplary embodiment;
Fig. 3 is a block diagram of an image selection apparatus according to an exemplary embodiment;
Fig. 4 is a block diagram of an image selection apparatus according to another exemplary embodiment;
Fig. 5 is a block diagram of an image selection apparatus according to an exemplary embodiment.
Detailed description of the embodiments
Exemplary embodiments will be described in detail here, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the invention; rather, they are merely examples of apparatuses and methods consistent with some aspects of the invention as detailed in the appended claims.
Fig. 1 is a flowchart of an image selection method according to an exemplary embodiment. As shown in Fig. 1, the image selection method of this embodiment is used in a terminal. The terminal may be a mobile phone, tablet computer, notebook computer or similar device with a camera function, or another device with a photographing function such as a video camera. The image selection method includes the following steps.
In step S11, the luminance mean of each image among the images to be selected is calculated, and the brightness evaluation value of each image is calculated from the luminance mean.
Under normal conditions, the lighting and other conditions at the time of shooting differ even for the same scene, so the brightness of the images differs. In the embodiments of the present disclosure, the terminal performs a feature analysis on the brightness of the images to be selected: it calculates the luminance mean of each image and, from it, the brightness evaluation value of each image. The brightness evaluation value describes the luminance information of an image well and effectively reflects its lighting conditions, for example whether the image deviates from the range the human eye adapts to best, is overexposed, or is too dark.
In step S12, the first weight map mean of each image to be selected in the time domain is calculated, the energy occupation ratio of each image to be selected in the frequency domain is calculated, and the clarity weight map of each image to be selected is calculated from the first weight map mean and the energy occupation ratio.
Under normal conditions, focusing and other conditions at the time of shooting differ even for the same scene, so the clarity of the images differs. In the embodiments of the present disclosure, the terminal performs a feature analysis on each image to be evaluated in the time domain and calculates its first weight map mean there. The first weight map mean in the time domain describes the local regions of an image well and effectively captures its edge structure information. The terminal also performs a feature analysis on each image in the frequency domain and calculates its energy occupation ratio there. The energy occupation ratio in the frequency domain describes the image as a whole, including regions of lower contrast, and effectively captures its global information. The clarity weight map of each image is calculated from the first weight map mean in the time domain and the energy occupation ratio in the frequency domain. By combining the time-domain information, which considers the local parts of the image, with the frequency-domain information, which considers the image as a whole, the terminal describes the clarity of an image well and effectively reflects conditions such as defocus and motion blur.
In step S13, the feature fusion value of each image is determined from the brightness evaluation value and the clarity weight map, and the image with the largest feature fusion value among the images to be selected is taken as the finally selected image.
In the embodiments of the present disclosure, the quality of the images to be selected is evaluated by combining their luminance information and sharpness information. The terminal determines the feature fusion value of each image from the brightness evaluation value and the clarity weight map and takes the image with the largest feature fusion value as the finally selected image. It can thus choose the best image among the images of the same scene, in particular among similar pictures, intelligently indicate to the user which picture is of best quality, and intelligently delete the remaining pictures among the images to be selected.
In the image selection method of this embodiment, the luminance mean of each image among the images to be selected is calculated and the brightness evaluation value is calculated from it; the first weight map mean in the time domain and the energy occupation ratio in the frequency domain are calculated for each image, and the clarity weight map is calculated from them; the feature fusion value is determined from the brightness evaluation value and the clarity weight map, and the image with the largest feature fusion value is taken as the finally selected image. The images to be selected are ranked by quality using both their luminance information and their sharpness information, and the image of best quality is selected: the best image among images of the same scene, in particular among similar images.
Fig. 2 is a flowchart of an image selection method according to another exemplary embodiment. On the basis of the embodiment shown in Fig. 1, this embodiment elaborates on how the terminal calculates the brightness evaluation value and the clarity weight map of each image, and how it determines the feature fusion value of each image from them. As shown in Fig. 2, the image selection method includes the following steps.
In step S21, at least two images to be selected are obtained.
In the embodiments of the present disclosure, the terminal may obtain the images to be selected by shooting with a camera, or by reading images to be selected that are stored in advance in a storage module of the terminal. The images to be selected are several images of similar content shot of the same scene.
In step S22, each image to be selected is transformed into log space; after the mean of each image is taken in log space, each log-space image is transformed back to the original space and the resulting value is taken as the luminance mean; and the brightness evaluation value is calculated from the luminance mean using the formula L1 = 1 - abs(v - 0.5). Then step S26 is executed.
Specifically, each image f(x, y) to be selected is transformed into log space using the formula f'(x, y) = log(f(x, y) + 1); after the mean of each image is taken in log space, each log-space image f'(x, y) is transformed back to the original space using the formula v = exp(mean(f'(x, y))) - 1 and the resulting value is taken as the luminance mean v; and the brightness evaluation value L1 is calculated from v using the formula L1 = 1 - abs(v - 0.5). Here exp(mean(f'(x, y))) denotes the exponential function with the natural constant e as base and mean(f'(x, y)) as exponent; mean(f'(x, y)) denotes averaging the log-space image f'(x, y) of each image to be selected; and abs(v - 0.5) denotes taking the absolute value of the difference between the luminance mean and the reference point 0.5.
In the embodiments of the present disclosure, each image to be selected is mapped into log space, the mean of the image is taken there, and the result is then inverted back to serve as the computed luminance mean. This better matches the visual characteristics of the human eye, so the luminance information obtained is more accurate. Meanwhile, the evaluation criterion for image brightness uses the formula L1 = 1 - abs(v - 0.5): the luminance mean of the image is compared with the reference point 0.5 to judge how far the current luminance mean is from it. The farther the current luminance mean is from 0.5, the smaller the brightness evaluation value L1, indicating that the image is overexposed or too dark; the closer it is to 0.5, the larger L1, indicating that the image is neither overexposed nor too dark.
In step S23, each image to be selected is divided into at least one region block; the comprehensive difference value of the pixels in each region block is calculated; the largest comprehensive difference value in each region block is taken as the comprehensive difference value of that region block, the comprehensive difference values of all the region blocks forming the comprehensive difference map of the region blocks; the first weight map is calculated from the comprehensive difference map of the region blocks; and the first weight map mean is calculated using the formula S1 = mean(S). Then step S25 is executed.
Specifically, the comprehensive difference value w(x) of the pixels in each region block is calculated, where Xi and Xj denote pixel values within a region block of an image to be selected, Ω denotes the image region formed by all the region blocks of an image to be selected in the time domain, and i and j are positive integers greater than or equal to 1 with i ≠ j. The first weight map S is calculated from the comprehensive difference map of the region blocks, and the first weight map mean S1 is calculated using the formula S1 = mean(S), where the maximum is taken over the comprehensive difference values v(Ω1) of the pixels in the region blocks and mean(S) denotes averaging the first weight map S.
In this embodiment of the disclosure, the terminal first divides each image to be selected into several region blocks, each containing several pixels, and from the pixel values in a region block computes the comprehensive difference value of the pixels at preset positions in that block. For example, if each region block contains 8*8 (64) pixels, the comprehensive difference value can be computed over the 7*7 (49) pixels enclosed to the right, below, and to the lower right. By computing the comprehensive difference value only at these preset positions, the boundary points of each region block are excluded, which ensures the accuracy of the comprehensive difference value of the block's pixels. Second, the comprehensive difference value of a region block's pixels represents the difference between a pixel and its surroundings: a larger difference indicates a relatively higher contrast in the region block where the pixel lies. The terminal uses the maximum comprehensive difference value among the region blocks as the comprehensive difference value of every region block, so that all region blocks share the same value; in this way the image under evaluation is judged by the block with the strongest pixel contrast, i.e. by its best-contrasted region. For example, if an image under evaluation is divided into 4 region blocks whose computed comprehensive difference values are 0.2, 0.3, 0.5 and 0.8 respectively, then the maximum value 0.8 is used as the comprehensive difference value of all 4 region blocks, i.e. every region block's comprehensive difference value becomes 0.8. Third, the comprehensive difference map represents the first weight map of the image in the time domain, there being a correspondence between the comprehensive difference map of the region blocks and the first weight map. The terminal calculates the first weight map S from the comprehensive difference map of the region blocks and then calculates the first weight map mean S1 using the formula S1 = mean(S). The first weight map mean S1 describes the clarity of the image in the time domain: a larger S1 indicates a sharper image, and a smaller S1 a blurrier one.
Further, in step S23, before the maximum comprehensive difference value among the region blocks is used as the comprehensive difference value of every region block, the method further includes: calculating the maximum comprehensive difference value among the region blocks from the comprehensive difference values of the pixels in each region block.

In this embodiment of the disclosure, given the comprehensive difference values of the preset-position pixels in each region block, the maximum comprehensive difference value among the region blocks may be determined by pairwise comparison, or computed with a maximum-value function over the blocks; this embodiment of the disclosure places no restriction on, and does not elaborate, the choice.
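The block-partition and maximum-taking structure of step S23 can be sketched as below. The exact comprehensive-difference formula is given in the patent only as a drawing, so this sketch substitutes an assumed neighbour-difference measure over the 7*7 interior of each 8*8 block (following the example above); it illustrates the shape of the step, not the patent's precise computation.

```python
import numpy as np

def first_weight_map_mean(image: np.ndarray, block: int = 8) -> float:
    """First weight map mean S1 of step S23, sketched under assumptions.

    The comprehensive difference value of a block is computed here as the
    mean absolute difference of each interior pixel against its left,
    upper, and upper-left neighbours -- an assumed stand-in for the
    formula shown only as a drawing in the original patent.
    """
    h, w = image.shape
    diffs = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            b = image[y:y + block, x:x + block]
            # 7*7 interior of an 8*8 block: boundary points are excluded
            d = (np.abs(b[1:, 1:] - b[1:, :-1]).mean()      # horizontal
                 + np.abs(b[1:, 1:] - b[:-1, 1:]).mean()    # vertical
                 + np.abs(b[1:, 1:] - b[:-1, :-1]).mean())  # diagonal
            diffs.append(d)
    # every block is assigned the maximum value, so the first weight
    # map is constant and its mean S1 equals that maximum
    return float(max(diffs))
```

With the 4-block example above (0.2, 0.3, 0.5, 0.8), every block would be assigned 0.8 and S1 would equal 0.8.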
In step S24, each image among the images to be selected is transformed to the frequency domain and filtered there; after filtering, the portion of each image's energy values that exceeds a preset threshold is computed, and from it the energy occupation ratio of each image in the frequency domain is calculated. Step S25 is then executed.

Here, each image I among the images to be selected is transformed to the frequency domain using the formula F = fft(I). Using the formula F1 = abs(F) > T, the energy values F1 that exceed the preset threshold T are computed from the energy values F of each filtered image. F = fft(I) denotes performing a Fourier transform on each image I to be selected; abs(F) denotes taking the modulus of the energy values F of each filtered image; and the preset threshold is T = 5.
In this embodiment of the disclosure, the terminal first Fourier-transforms each image to be selected using the formula F = fft(I), converting it from the time domain to the frequency domain. Second, after each image has been transformed to the frequency domain, one filtering pass is performed there; the energy values whose modulus exceeds the preset threshold are computed from the filtered images, and from them the energy occupation ratio S2 of each image in the frequency domain is calculated. The larger S2 is, the sharper the image, i.e. the higher its clarity.
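The frequency-domain step can be sketched as follows. The precise definition of the occupation ratio appears only as a drawing in the original patent; the sketch assumes it to be the fraction of Fourier coefficients whose modulus exceeds the preset threshold T = 5.

```python
import numpy as np

def frequency_energy_ratio(image: np.ndarray, threshold: float = 5.0) -> float:
    """Energy occupation ratio S2 of step S24, sketched under assumptions.

    The ratio is taken (as an assumption) to be the share of frequency-
    domain coefficients whose modulus exceeds the preset threshold T.
    """
    F = np.fft.fft2(image)           # F = fft(I): transform to the frequency domain
    F1 = np.abs(F) > threshold       # F1 = abs(F) > T, with T = 5 by default
    return float(F1.mean())          # assumed ratio: share of coefficients above T
```

Sharper images spread more energy into high-frequency coefficients, so more coefficients clear the threshold and S2 rises, matching the "larger S2, higher clarity" criterion above.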
In step S25, the clarity weight map of each image to be selected is calculated from the first weight map mean and the energy occupation ratio using the formula L2 = S1*β + S2*(1 − β).

Here, β is the weight of the first weight map mean.
In this embodiment of the disclosure, the time domain and the frequency domain of the image are combined by a weighted fusion: the terminal calculates the clarity weight map of the image from the first weight map mean obtained in step S23 and the energy occupation ratio obtained in step S24. The larger the clarity weight map L2 of the image, the higher its clarity; the smaller L2, the lower its clarity.
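The weighted fusion of step S25 is a plain convex combination; a minimal sketch follows, with β = 0.5 taken as an assumed default since the patent does not fix a value here.

```python
def clarity_weight(s1: float, s2: float, beta: float = 0.5) -> float:
    """L2 = S1*beta + S2*(1 - beta): fusion of the time-domain and
    frequency-domain clarity measures (step S25).

    beta, the weight of the first weight map mean, is not specified in
    the source; 0.5 is an assumed default.
    """
    return s1 * beta + s2 * (1.0 - beta)
```

Setting beta = 1.0 reduces L2 to the time-domain measure S1 alone, and beta = 0.0 to the frequency-domain measure S2 alone.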
In step S26, the feature fusion value of the image is calculated from the luminance evaluation value and the clarity weight map using the formula L = L1*L2.

In this embodiment of the disclosure, the luminance and clarity features of the image are fused: the terminal calculates the feature fusion value of the image from the luminance evaluation value obtained in step S22 and the clarity weight map obtained in step S25. The larger the feature fusion value L of an image, the better its combined features and the better its quality.
In step S27, the image with the maximum feature fusion value among the images to be selected is taken as the finally selected image.

In this embodiment of the disclosure, the quality of the images to be selected is evaluated by combining their luminance information and clarity information: the terminal determines the feature fusion value of each image from the luminance evaluation value and the clarity weight map via the formula L = L1*L2, and takes the image with the maximum feature fusion value as the finally selected image. In this way the optimal image can be chosen from the candidates under the same scene, and especially among similar pictures; the terminal can intelligently prompt the user with the picture of optimal quality and intelligently delete the remaining candidate pictures.
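Steps S26 and S27 together reduce to scoring each candidate by L = L1 * L2 and keeping the argmax; a minimal sketch, with the per-image scores assumed to have been computed already:

```python
def select_best(candidates):
    """Pick the image with the maximum feature fusion value L = L1 * L2
    (steps S26-S27).

    `candidates` is assumed to be a list of (image_id, l1, l2) tuples,
    where l1 is the luminance evaluation value and l2 the clarity weight
    of that image; the tuple layout is illustrative, not from the source.
    """
    return max(candidates, key=lambda c: c[1] * c[2])[0]
```

Note that a well-lit but blurry image and a sharp but over-dark one can both lose to a candidate that is merely good on both axes, which is the point of multiplying the two scores.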
With the image selection method of this embodiment, the luminance mean of each image to be selected is calculated and the luminance evaluation value of the image derived from it; the first weight map mean of each image in the time domain and the energy occupation ratio of each image in the frequency domain are calculated, and the clarity weight map of each image derived from them; the feature fusion value of each image is determined from the luminance evaluation value and the clarity weight map, and the image with the maximum feature fusion value is taken as the finally selected image. By ranking the quality of the candidate images using both their luminance information and their clarity information, the image of optimal quality is selected, so that under the same scene, and especially among similar images, the optimal image is chosen. Moreover, by transforming each image to be selected into log space, taking the mean of the log-space image, transforming the result back to the original space as the luminance mean, and calculating the luminance evaluation value with the formula L1 = 1 − abs(v − 0.5), the method better matches the visual characteristics of the human eye and obtains more accurate image luminance information. In addition, by dividing each image to be selected into at least one region block, calculating the comprehensive difference value of the pixels in each block, using the maximum comprehensive difference value among the blocks as the value of every block, forming the comprehensive difference map of the region blocks from these values, calculating the first weight map from that map, and calculating the first weight map mean with the formula S1 = mean(S), the first weight map of the images to be selected in the time domain is computed accurately, which in turn improves the accuracy of the clarity weight map. Likewise, by transforming each image to be selected to the frequency domain, filtering it there, computing the energy values that exceed the preset threshold after filtering, and calculating from them the energy occupation ratio of each image in the frequency domain, the energy occupation ratio is computed accurately, which again improves the accuracy with which the clarity weight map is calculated.
Fig. 3 is a block diagram of an image selection device according to an exemplary embodiment. Referring to Fig. 3, the device includes: a first computing module 31, a second computing module 32, and a selection module 33.

The first computing module 31 is configured to calculate the luminance mean of each image to be selected and to determine the luminance evaluation value of the image from the luminance mean.

The second computing module 32 is configured to calculate the first weight map mean of each image to be selected in the time domain, to calculate the energy occupation ratio of each image in the frequency domain, and to calculate the clarity weight map of each image from the first weight map mean and the energy occupation ratio.

The selection module 33 is configured to determine the feature fusion value of the image from the luminance evaluation value calculated by the first computing module 31 and the clarity weight map calculated by the second computing module 32, and to take the image with the maximum feature fusion value among the images to be selected as the finally selected image.

As for the device in the above embodiment, the specific manner in which each module performs its operations has been described in detail in the embodiments of the related method and will not be elaborated here.

With the image selection device of this embodiment, the luminance mean of each image to be selected is calculated and the luminance evaluation value derived from it; the first weight map mean in the time domain and the energy occupation ratio in the frequency domain are calculated, and the clarity weight map of each image derived from them; the feature fusion value is determined from the luminance evaluation value and the clarity weight map, and the image with the maximum feature fusion value is taken as the finally selected image. By ranking the quality of the candidate images using both luminance information and clarity information, the image of optimal quality is selected, so that under the same scene, and especially among similar images, the optimal image is chosen.
Fig. 4 is a block diagram of an image selection device according to another exemplary embodiment. Referring to Fig. 4, on the basis of the embodiment shown in Fig. 3, the device further includes: an acquisition module 34.

The acquisition module 34 is configured to acquire at least two images to be selected.

The first computing module 31 includes: a space transform submodule 311, a first determining submodule 312, and a first computing submodule 313.

The space transform submodule 311 is configured to transform each image f(x, y) among the images to be selected into log space using the formula f′(x, y) = log(f(x, y) + 1).

The first determining submodule 312 is configured to take, using the formula v = exp(mean(f′(x, y))) − 1, the mean of each log-space image transformed by the space transform submodule 311 and to transform it back to the original space, the resulting value serving as the luminance mean v.

Here, exp(mean(f′(x, y))) denotes the exponential function with the natural constant e as base and mean(f′(x, y)) as exponent, and mean(f′(x, y)) denotes averaging each log-space image f′(x, y) among the images to be selected.

The first computing submodule 313 is configured to calculate the luminance evaluation value L1 of the image from the luminance mean v determined by the first determining submodule 312, using the formula L1 = 1 − abs(v − 0.5).

Here, abs(v − 0.5) denotes taking the absolute value of the difference between the luminance mean and the reference point 0.5.
The second computing module 32 includes: a blocking submodule 3211, a block computing submodule 3212, a second determining submodule 3213, a weight map computing submodule 3214, and a mean computing submodule 3215.

The blocking submodule 3211 is configured to divide each image to be selected into at least one region block.

The block computing submodule 3212 is configured to calculate the comprehensive difference value w(x) of the pixels in each region block divided by the blocking submodule 3211, from the pairwise differences of the pixel values within the block.

Here, Xi and Xj denote pixel values in each image region block among the images to be selected, Ω denotes the image region formed in the time domain by all region blocks of each image, and i and j are each positive integers greater than or equal to 1, with i ≠ j.

The second determining submodule 3213 is configured to use the maximum comprehensive difference value among the region blocks as the comprehensive difference value of every region block, the comprehensive difference values of all region blocks forming the comprehensive difference map of the region blocks.

The weight map computing submodule 3214 is configured to calculate the first weight map S from the comprehensive difference map of the region blocks determined by the second determining submodule 3213.

Here, the first weight map takes, for the pixels of each region block, the maximum comprehensive difference value v(Ω1).

The mean computing submodule 3215 is configured to calculate the first weight map mean S1 from the first weight map S calculated by the weight map computing submodule 3214, using the formula S1 = mean(S).

Here, mean(S) denotes averaging the first weight map S.

Further, the second computing module 32 also includes: a maximum-value computing submodule.

The maximum-value computing submodule is configured to calculate, from the comprehensive difference values of the pixels in each region block, the maximum comprehensive difference value among the region blocks, before the second determining submodule 3213 uses that maximum as the comprehensive difference value of every region block.
Further, the second computing module also includes: a frequency-domain transform submodule 3221, a filtering submodule 3222, an energy value computing submodule 3223, and an occupation ratio computing submodule 3224.

The frequency-domain transform submodule 3221 is configured to transform each image I among the images to be selected to the frequency domain using the formula F = fft(I).

Here, fft(I) denotes performing a Fourier transform on each image I to be selected.

The filtering submodule 3222 is configured to filter each of the images to be selected in the frequency domain to which the frequency-domain transform submodule 3221 has transformed them.

The energy value computing submodule 3223 is configured to calculate, using the formula F1 = abs(F) > T, the energy values F1 that exceed the preset threshold T from the energy values F of each image filtered by the filtering submodule 3222.

Here, abs(F) denotes taking the modulus of the energy values F of each filtered image, and the preset threshold is T = 5.

The occupation ratio computing submodule 3224 is configured to calculate, from the energy values F1 calculated by the energy value computing submodule 3223, the energy occupation ratio S2 of each image to be selected in the frequency domain.

Further, the second computing module also includes: a second computing submodule 323.

The second computing submodule 323 is configured to calculate the clarity weight map L2 of each image to be selected from the first weight map mean S1 calculated by the mean computing submodule 3215 and the energy occupation ratio S2 calculated by the occupation ratio computing submodule 3224, using the formula L2 = S1*β + S2*(1 − β).

Here, β is the weight of the first weight map mean.
Further, the selection module 33 includes: a selection computing submodule 331.

The selection computing submodule 331 is configured to calculate the feature fusion value L of the image from the luminance evaluation value L1 calculated by the first computing submodule 313 and the clarity weight map L2 calculated by the second computing submodule 323, using the formula L = L1*L2.

As for the device in the above embodiment, the specific manner in which each module performs its operations has been described in detail in the embodiments of the related method and will not be elaborated here.

With the image selection device of this embodiment, the luminance mean of each image to be selected is calculated and the luminance evaluation value of the image derived from it; the first weight map mean of each image in the time domain and the energy occupation ratio of each image in the frequency domain are calculated, and the clarity weight map of each image derived from them; the feature fusion value of each image is determined from the luminance evaluation value and the clarity weight map, and the image with the maximum feature fusion value is taken as the finally selected image. By ranking the quality of the candidate images using both their luminance information and their clarity information, the image of optimal quality is selected, so that under the same scene, and especially among similar images, the optimal image is chosen. Moreover, by transforming each image to be selected into log space, taking the mean of the log-space image, transforming the result back to the original space as the luminance mean, and calculating the luminance evaluation value with the formula L1 = 1 − abs(v − 0.5), the device better matches the visual characteristics of the human eye and obtains more accurate image luminance information. In addition, by dividing each image to be selected into at least one region block, calculating the comprehensive difference value of the pixels in each block, using the maximum comprehensive difference value among the blocks as the value of every block, forming the comprehensive difference map of the region blocks from these values, calculating the first weight map from that map, and calculating the first weight map mean with the formula S1 = mean(S), the first weight map of the images to be selected in the time domain is computed accurately, which in turn improves the accuracy of the clarity weight map. Likewise, by transforming each image to be selected to the frequency domain, filtering it there, computing the energy values that exceed the preset threshold after filtering, and calculating from them the energy occupation ratio of each image in the frequency domain, the energy occupation ratio is computed accurately, which again improves the accuracy with which the clarity weight map is calculated.
Fig. 5 is a block diagram of an image selection device according to an exemplary embodiment. For example, the device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.

Referring to Fig. 5, the device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.

The processing component 802 generally controls the overall operation of the device 800, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 802 may include one or more processors 820 to execute instructions so as to perform all or part of the steps of the methods described above. In addition, the processing component 802 may include one or more modules to facilitate interaction between the processing component 802 and the other components. For example, the processing component 802 may include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.

The memory 804 is configured to store various types of data to support operation on the device 800. Examples of such data include instructions for any application or method operated on the device 800, contact data, phonebook data, messages, pictures, videos, and so on. The memory 804 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk.

The power component 806 supplies power to the various components of the device 800. The power component 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.

The multimedia component 808 includes a screen providing an output interface between the device 800 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action but also detect the duration and pressure associated with the touch or slide. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the device 800 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front or rear camera may be a fixed optical lens system or have focus and optical zoom capability.

The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a microphone (MIC) which, when the device 800 is in an operation mode such as a call mode, a recording mode, or a voice recognition mode, is configured to receive external audio signals. The received audio signals may be further stored in the memory 804 or sent via the communication component 816. In some embodiments, the audio component 810 also includes a speaker for outputting audio signals.

The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.

The sensor component 814 includes one or more sensors for providing state assessments of various aspects of the device 800. For example, the sensor component 814 may detect the open/closed state of the device 800 and the relative positioning of components, such as the display and keypad of the device 800, and may also detect a change of position of the device 800 or of a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change of temperature of the device 800. The sensor component 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.

The communication component 816 is configured to facilitate wired or wireless communication between the device 800 and other devices. The device 800 can access a wireless network based on a communication standard, such as WiFi, 2G, 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 also includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.

In an exemplary embodiment, the device 800 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above methods.

In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 804 including instructions, executable by the processor 820 of the device 800 to perform the above methods. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
A non-transitory computer-readable storage medium: when the instructions in the storage medium are executed by the processor of a mobile terminal, the mobile terminal is enabled to perform an image selection method, the method including:

calculating the luminance mean of each image to be selected, and calculating the luminance evaluation value of the image from the luminance mean;

calculating the first weight map mean of each image to be selected in the time domain, calculating the energy occupation ratio of each image in the frequency domain, and calculating the clarity weight map of each image from the first weight map mean and the energy occupation ratio;

determining the feature fusion value of the image from the luminance evaluation value and the clarity weight map, and taking the image with the maximum feature fusion value among the images to be selected as the finally selected image.

Here, calculating the luminance mean of each image to be selected and calculating the luminance evaluation value of the image from the luminance mean includes:

transforming each image f(x, y) among the images to be selected into log space using the formula f′(x, y) = log(f(x, y) + 1);

taking, using the formula v = exp(mean(f′(x, y))) − 1, the mean of each log-space image f′(x, y) and transforming it back to the original space, the resulting value serving as the luminance mean v;

calculating the luminance evaluation value L1 of the image from the luminance mean v using the formula L1 = 1 − abs(v − 0.5);

where exp(mean(f′(x, y))) denotes the exponential function with the natural constant e as base and mean(f′(x, y)) as exponent, mean(f′(x, y)) denotes averaging each log-space image f′(x, y) among the images to be selected, and abs(v − 0.5) denotes taking the absolute value of the difference between the luminance mean and the reference point 0.5.
Calculating the first weight map mean of each candidate image in the time domain comprises:
dividing each candidate image into at least one region block;
calculating the comprehensive difference value w(x) of the pixels in each region block from the pixel values Xi and Xj;
where Xi, Xj denote pixel values within each image region block of the candidate images, Ω denotes the time-domain region formed by all region blocks of each candidate image, i and j are each positive integers greater than or equal to 1, and i ≠ j;
taking the largest comprehensive difference value within a region block as the comprehensive difference value of that region block, the comprehensive difference values of all region blocks forming the comprehensive difference map of the region blocks;
calculating the first weight map S from the comprehensive difference map of the region blocks;
calculating the first weight map mean S1 using the formula S1 = mean(S).
Here, the maximum over Ω denotes taking the maximum of the comprehensive difference values w(Ω1) of the pixels in a region block, and mean(S) denotes averaging the first weight map S.
Before the largest comprehensive difference value within a region block is taken as the comprehensive difference value of that region block, the method further comprises:
calculating the largest comprehensive difference value within each region block from the comprehensive difference values of the pixels in that region block.
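The formula images for the comprehensive difference value w(x) and for the first weight map S are not reproduced in this text, so the sketch below substitutes a plausible stand-in: the maximum absolute pairwise difference |Xi - Xj| within each region block (equivalently, the block's maximum minus its minimum). This is an assumption for illustration, not the patented formula, and the block size of 8 is likewise arbitrary:

```python
import numpy as np

def first_weight_map_mean(image, block=8):
    """First weight map mean S1 = mean(S) under the time domain (sketch).

    Each region block's comprehensive difference value is taken here
    as max|Xi - Xj| over its pixel pairs -- an assumed stand-in for
    the source's missing formula.
    """
    f = np.asarray(image, dtype=np.float64)
    h, w = f.shape
    rows, cols = h // block, w // block
    s = np.zeros((rows, cols))                 # comprehensive difference map
    for bi in range(rows):
        for bj in range(cols):
            blk = f[bi * block:(bi + 1) * block,
                    bj * block:(bj + 1) * block]
            s[bi, bj] = blk.max() - blk.min()  # max|Xi - Xj| in the block
    return s.mean()                            # S1 = mean(S)

flat = first_weight_map_mean(np.zeros((16, 16)))
textured = first_weight_map_mean(np.tile(np.arange(16.0), (16, 1)))
```

A uniform image yields S1 = 0, while images with local variation score higher, which is consistent with the map's role as a sharpness cue.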
Calculating the energy occupation ratio of each candidate image in the frequency domain comprises:
transforming each candidate image I into the frequency domain using the formula F = fft(I);
where fft(I) denotes applying the Fourier transform to each candidate image I;
filtering each candidate image in the frequency domain;
calculating, using the formula F1 = abs(F) > T, the energy values F1 at which the filtered energy values F of each candidate image exceed the preset threshold T;
where abs(F) denotes taking the modulus of the filtered energy values F of each candidate image, and the preset threshold T = 5;
calculating from F1 the energy occupation ratio S2 of each candidate image in the frequency domain.
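A NumPy sketch of this frequency-domain step. The source's filter and its formula image for S2 are not reproduced above, so two assumptions are made here: the filter is approximated by zeroing the DC coefficient, and S2 is taken as the fraction of coefficients whose modulus exceeds T = 5:

```python
import numpy as np

def energy_occupation_ratio(image, threshold=5.0):
    """Energy occupation ratio S2 of one image in the frequency domain.

    F = fft(I); F1 = abs(F) > T marks the coefficients whose modulus
    exceeds the preset threshold T (T = 5 in the text). Zeroing the
    DC term stands in for the unspecified filtering step, and S2 is
    assumed to be the fraction of coefficients marked by F1.
    """
    F = np.fft.fft2(np.asarray(image, dtype=np.float64))
    F.flat[0] = 0.0                  # assumed filter: drop the DC component
    F1 = np.abs(F) > threshold       # F1 = abs(F) > T
    return F1.mean()                 # assumed S2: ratio of energetic bins

const = energy_occupation_ratio(np.full((16, 16), 3.0))
edges = energy_occupation_ratio(
    50.0 + 50.0 * (-1.0) ** np.add.outer(np.arange(16), np.arange(16)))
```

A constant image has no energy outside DC, so its ratio is 0; an image with strong high-frequency content (here a checkerboard) scores higher.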
Calculating the clarity weight map of each candidate image from the first weight map mean and the energy occupation ratio comprises:
calculating the clarity weight map L2 of each candidate image from the first weight map mean S1 and the energy occupation ratio S2 using the formula L2 = S1*β + S2*(1 - β);
where β is the weight of the first weight map mean.
Determining the feature fusion value of an image from the brightness evaluation value and the clarity weight map comprises:
calculating the feature fusion value L of the image from the brightness evaluation value L1 and the clarity weight map L2 using the formula L = L1*L2.
Before the luminance mean of each candidate image is calculated, the method further comprises:
obtaining at least two candidate images.
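Putting the pieces together, a self-contained sketch of the whole selection: L1 from the log-space luminance mean, S1 and S2 computed with the same assumed stand-ins for the missing formulas and filter, L2 = S1*β + S2*(1 - β), and L = L1*L2. The values β = 0.5 and block = 8 are arbitrary choices, as the source gives neither:

```python
import numpy as np

def feature_fusion_value(img, beta=0.5, threshold=5.0, block=8):
    """Feature fusion value L = L1 * L2 for one grayscale image in [0, 1]."""
    f = np.asarray(img, dtype=np.float64)
    # L1: brightness evaluation via the log-space luminance mean
    v = np.exp(np.log(f + 1.0).mean()) - 1.0
    l1 = 1.0 - abs(v - 0.5)
    # S1: time-domain first weight map mean; max-minus-min per block
    # is an assumed stand-in for the source's missing formula
    h, w = f.shape
    g = f[:h // block * block, :w // block * block]
    blocks = g.reshape(h // block, block, w // block, block)
    s1 = (blocks.max(axis=(1, 3)) - blocks.min(axis=(1, 3))).mean()
    # S2: frequency-domain energy occupation ratio; DC removal stands in
    # for the unspecified filter, ratio-of-bins for the missing formula
    F = np.fft.fft2(f)
    F.flat[0] = 0.0
    s2 = (np.abs(F) > threshold).mean()
    # L2 = S1*beta + S2*(1 - beta), then L = L1 * L2
    return l1 * (s1 * beta + s2 * (1.0 - beta))

def select_image(candidates, **kw):
    """Return the candidate with the largest feature fusion value."""
    return max(candidates, key=lambda im: feature_fusion_value(im, **kw))

rng = np.random.default_rng(0)
sharp = rng.random((16, 16))       # well-exposed, high-detail candidate
blurry = np.full((16, 16), 0.1)    # flat, underexposed candidate
best = select_image([sharp, blurry])
```

Under these assumptions the flat, underexposed candidate gets a zero clarity weight and is never chosen over a detailed one, mirroring the selection rule above.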
Other embodiments of the invention will readily occur to those skilled in the art upon consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention that follow its general principles and include such departures from the present disclosure as come within common knowledge or customary practice in the art. The specification and examples are to be regarded as illustrative only, with the true scope and spirit of the invention being indicated by the following claims.
It should be understood that the invention is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the invention is limited only by the appended claims.

Claims (14)

1. An image selection method, characterized by comprising:
calculating the luminance mean of each candidate image, and calculating the brightness evaluation value of the image from the luminance mean;
performing feature analysis on the candidate images in the time domain to calculate the first weight map mean of each candidate image in the time domain; performing feature analysis on the candidate images in the frequency domain to calculate the energy occupation ratio of each candidate image in the frequency domain; and calculating the clarity weight map of each candidate image from the time-domain first weight map mean and the frequency-domain energy occupation ratio;
determining the feature fusion value of each image from the brightness evaluation value and the clarity weight map, and taking the candidate image with the largest feature fusion value as the final selected image,
wherein calculating the first weight map mean of each candidate image in the time domain comprises:
calculating the first weight map S from the comprehensive difference map of the region blocks;
calculating the first weight map mean S1 using the formula S1 = mean(S);
where the maximum over Ω denotes taking the maximum of the comprehensive difference values w(Ω1) of the pixels in a region block, and mean(S) denotes averaging the first weight map S,
wherein calculating the energy occupation ratio of each candidate image in the frequency domain comprises:
calculating, using the formula F1 = abs(F) > T, the energy values F1 at which the filtered energy values F of each candidate image exceed the preset threshold T;
where abs(F) denotes taking the modulus of the filtered energy values F of each candidate image, and the preset threshold T = 5;
calculating from F1 the energy occupation ratio S2 of each candidate image in the frequency domain,
wherein calculating the clarity weight map of each candidate image from the first weight map mean and the energy occupation ratio comprises:
calculating the clarity weight map L2 of each candidate image from the first weight map mean S1 and the energy occupation ratio S2 using the formula L2 = S1*β + S2*(1 - β);
where β is the weight of the first weight map mean.
2. The method according to claim 1, wherein calculating the luminance mean of each candidate image and calculating the brightness evaluation value of the image from the luminance mean comprises:
transforming each candidate image f(x, y) into log space using the formula f'(x, y) = log(f(x, y) + 1);
taking the mean of each image in log space and mapping it back to the original space using the formula v = exp(mean(f'(x, y))) - 1, the resulting value being the luminance mean v;
calculating the brightness evaluation value L1 of the image from the luminance mean v using the formula L1 = 1 - abs(v - 0.5);
where exp(mean(f'(x, y))) denotes the exponential function with the natural constant e as base and mean(f'(x, y)) as exponent; mean(f'(x, y)) denotes averaging each candidate image f'(x, y) in log space; and abs(v - 0.5) denotes taking the absolute value of the difference between the luminance mean and the reference point 0.5.
3. The method according to claim 2, wherein calculating the first weight map mean of each candidate image in the time domain further comprises:
dividing each candidate image into at least one region block;
calculating the comprehensive difference value w(x) of the pixels in each region block from the pixel values Xi and Xj;
where Xi, Xj denote pixel values within each image region block of the candidate images, Ω denotes the time-domain region formed by all region blocks of each candidate image, i and j are each positive integers greater than or equal to 1, and i ≠ j;
taking the largest comprehensive difference value within a region block as the comprehensive difference value of that region block, the comprehensive difference values of all region blocks forming the comprehensive difference map of the region blocks.
4. The method according to claim 3, wherein, before the largest comprehensive difference value within a region block is taken as the comprehensive difference value of that region block, the method further comprises:
calculating the largest comprehensive difference value within each region block from the comprehensive difference values of the pixels in that region block.
5. The method according to claim 4, wherein calculating the energy occupation ratio of each candidate image in the frequency domain further comprises:
transforming each candidate image I into the frequency domain using the formula F = fft(I);
where fft(I) denotes applying the Fourier transform to each candidate image I;
filtering each candidate image in the frequency domain.
6. The method according to claim 1, wherein determining the feature fusion value of an image from the brightness evaluation value and the clarity weight map comprises:
calculating the feature fusion value L of the image from the brightness evaluation value L1 and the clarity weight map L2 using the formula L = L1*L2.
7. The method according to any one of claims 1 to 6, wherein, before the luminance mean of each candidate image is calculated, the method further comprises:
obtaining at least two candidate images.
8. An image selection apparatus, characterized by comprising:
a first computing module configured to calculate the luminance mean of each candidate image and determine the brightness evaluation value of the image from the luminance mean;
a second computing module configured to perform feature analysis on the candidate images in the time domain to calculate the first weight map mean of each candidate image in the time domain, to perform feature analysis on the candidate images in the frequency domain to calculate the energy occupation ratio of each candidate image in the frequency domain, and to calculate the clarity weight map of each candidate image from the time-domain first weight map mean and the frequency-domain energy occupation ratio;
a selection module configured to determine the feature fusion value of each image from the brightness evaluation value calculated by the first computing module and the clarity weight map calculated by the second computing module, and to take the candidate image with the largest feature fusion value as the final selected image,
the second computing module comprising:
a weight map calculation submodule configured to calculate the first weight map S from the comprehensive difference map of the region blocks determined by the second determining submodule;
a mean calculation submodule configured to calculate the first weight map mean S1 from the first weight map S calculated by the weight map calculation submodule, using the formula S1 = mean(S);
where the maximum over Ω denotes taking the maximum of the comprehensive difference values w(Ω1) of the pixels in a region block, and mean(S) denotes averaging the first weight map S;
an energy value calculation submodule configured to calculate, using the formula F1 = abs(F) > T, the energy values F1 at which the energy values F of each candidate image filtered by the filtering submodule exceed the preset threshold T;
where abs(F) denotes taking the modulus of the filtered energy values F of each candidate image, and the preset threshold T = 5;
an occupation ratio calculation submodule configured to calculate, from the energy values F1 calculated by the energy value calculation submodule, the energy occupation ratio S2 of each candidate image in the frequency domain;
a second calculation submodule configured to calculate the clarity weight map L2 of each candidate image from the first weight map mean S1 calculated by the mean calculation submodule and the energy occupation ratio S2 calculated by the occupation ratio calculation submodule, using the formula L2 = S1*β + S2*(1 - β);
where β is the weight of the first weight map mean.
9. The apparatus according to claim 8, wherein the first computing module comprises:
a space transform submodule configured to transform each candidate image f(x, y) into log space using the formula f'(x, y) = log(f(x, y) + 1);
a first determining submodule configured to take the mean of each log-space image transformed by the space transform submodule and map it back to the original space using the formula v = exp(mean(f'(x, y))) - 1, the resulting value being the luminance mean v;
a first calculation submodule configured to calculate the brightness evaluation value L1 of the image from the luminance mean v determined by the first determining submodule, using the formula L1 = 1 - abs(v - 0.5);
where exp(mean(f'(x, y))) denotes the exponential function with the natural constant e as base and mean(f'(x, y)) as exponent; mean(f'(x, y)) denotes averaging each candidate image f'(x, y) in log space; and abs(v - 0.5) denotes taking the absolute value of the difference between the luminance mean and the reference point 0.5.
10. The apparatus according to claim 9, wherein the second computing module further comprises:
a partitioning submodule configured to divide each candidate image into at least one region block;
a partition calculation submodule configured to calculate, from the pixel values Xi and Xj, the comprehensive difference value w(x) of the pixels in each region block divided by the partitioning submodule;
where Xi, Xj denote pixel values within each image region block of the candidate images, Ω denotes the time-domain region formed by all region blocks of each candidate image, i and j are each positive integers greater than or equal to 1, and i ≠ j;
a second determining submodule configured to take the largest comprehensive difference value within a region block as the comprehensive difference value of that region block, the comprehensive difference values of all region blocks forming the comprehensive difference map of the region blocks.
11. The apparatus according to claim 10, wherein the second computing module further comprises a maximum calculation submodule;
the maximum calculation submodule being configured to calculate, before the second determining submodule takes the largest comprehensive difference value within a region block as the comprehensive difference value of that region block, the largest comprehensive difference value within each region block from the comprehensive difference values of the pixels in that region block.
12. The apparatus according to claim 11, wherein the second computing module further comprises:
a frequency-domain transform submodule configured to transform each candidate image I into the frequency domain using the formula F = fft(I);
where fft(I) denotes applying the Fourier transform to each candidate image I;
a filtering submodule configured to filter each candidate image in the frequency domain transformed by the frequency-domain transform submodule.
13. The apparatus according to claim 8, wherein the selection module comprises:
a selection calculation submodule configured to calculate the feature fusion value L of the image from the brightness evaluation value L1 calculated by the first calculation submodule and the clarity weight map L2 calculated by the second calculation submodule, using the formula L = L1*L2.
14. The apparatus according to any one of claims 8 to 13, further comprising:
an acquisition module configured to obtain at least two candidate images.
CN201510988384.7A 2015-12-24 2015-12-24 Image choosing method, apparatus and system Active CN105654470B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510988384.7A CN105654470B (en) 2015-12-24 2015-12-24 Image choosing method, apparatus and system


Publications (2)

Publication Number Publication Date
CN105654470A CN105654470A (en) 2016-06-08
CN105654470B true CN105654470B (en) 2018-12-11


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110807745B (en) * 2019-10-25 2022-09-16 北京小米智能科技有限公司 Image processing method and device and electronic equipment
CN111161198A (en) * 2019-12-11 2020-05-15 国网北京市电力公司 Control method and device of imaging equipment, storage medium and processor
CN111369531B (en) * 2020-03-04 2023-09-01 浙江大华技术股份有限公司 Image definition scoring method, device and storage device
CN116678827A (en) * 2023-05-31 2023-09-01 天芯电子科技(江阴)有限公司 LGA (land grid array) packaging pin detection system of high-current power supply module

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4720810B2 (en) * 2007-09-28 2011-07-13 富士フイルム株式会社 Image processing apparatus, imaging apparatus, image processing method, and image processing program
CN102209196B (en) * 2010-03-30 2016-08-03 株式会社尼康 Image processing apparatus and image evaluation method
KR101933454B1 (en) * 2012-09-25 2018-12-31 삼성전자주식회사 Photograph image generating method, apparatus therof, and medium storing program source thereof
CN103218778B (en) * 2013-03-22 2015-12-02 华为技术有限公司 The disposal route of a kind of image and video and device
US20150071547A1 (en) * 2013-09-09 2015-03-12 Apple Inc. Automated Selection Of Keeper Images From A Burst Photo Captured Set
CN103618855A (en) * 2013-12-03 2014-03-05 厦门美图移动科技有限公司 Photographing method and device for automatically selecting optimal image

Also Published As

Publication number Publication date
CN105654470A (en) 2016-06-08


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant