CN106447641A - Image generation device and method - Google Patents
- Publication number
- CN106447641A (application CN201610750445.0A)
- Authority
- CN
- China
- Prior art keywords
- image
- information
- wavelet decomposition
- monochrome
- black white
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20064—Wavelet transform [DWT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
The invention discloses an image generation device and method. The device includes: a first acquisition unit for acquiring a black-and-white image and a color image of the same acquisition target; a second acquisition unit for performing image processing on the black-and-white image and the color image respectively, to obtain monochrome image information of the black-and-white image and color image information of the color image; a determination unit for combining the monochrome image information and the color image information to determine the fusion weights of the black-and-white image and the color image; and a fusion unit for fusing the monochrome image information and the color image according to the fusion weights to generate a fused image. With the invention, the black-and-white image and the color image can be fused, and the fusion weight is determined from the image information of the black-and-white and color images themselves rather than being a fixed value set in advance. Because the fusion weights of different images are determined dynamically from the images themselves, the image fusion effect and the quality of the fused image are further improved.
Description
Technical field
The present invention relates to the field of information technology, and more particularly to an image generation device and method.
Background art
With the development of information technology, people's requirements on the quality of captured images keep rising, and high-resolution, high-performance lenses and similar structures have appeared. However, a high-performance lens tends to make the lens structure complex and costly. How to generate high-quality images with a structurally simpler, lower-cost lens is therefore an urgent problem in the prior art.
Summary of the invention
In view of this, embodiments of the present invention are expected to provide an image generation device and method capable of producing high-quality images.
To achieve the above purpose, the technical solution of the present invention is implemented as follows:
An embodiment of the present invention provides an image generation device, including:
a first acquisition unit for acquiring a black-and-white image and a color image of the same acquisition target;
a second acquisition unit for performing image processing on the black-and-white image and the color image respectively, to obtain the monochrome image information of the black-and-white image and the color image information of the color image;
a determination unit for combining the monochrome image information and the color image information to determine the fusion weights of the black-and-white image and the color image;
a fusion unit for fusing the monochrome image information and the color image according to the fusion weights to generate a fused image.
Based on the above scheme, the determination unit is specifically configured to: determine, based on the monochrome image information, the region contrast and region gradient value of the region around each pixel in the black-and-white image, and, based on the color image information, the region contrast and region gradient value of the region around each pixel in the color image; multiply the region contrast by the region gradient value to obtain a fusion parameter for each pixel of the black-and-white image and the color image respectively; and calculate, based on the fusion parameters, the fusion weight of each pixel of the black-and-white image and the color image.
Based on the above scheme, the second acquisition unit is specifically configured to extract the first luminance information of the black-and-white image, and to extract the second luminance information and the color information of the color image;
the fusion unit is specifically configured to process the first luminance information and the second luminance information with a preset function using the fusion weights, to obtain the third luminance information of the fused image, and to combine the third luminance information with the color information to generate the fused image.
Based on the above scheme, the second acquisition unit is specifically configured to perform wavelet decomposition on the black-and-white image and the color image respectively, obtaining the first wavelet decomposition coefficients of the luminance information of the black-and-white image and the second wavelet decomposition coefficients of the luminance information of the color image;
the fusion unit is specifically configured to combine the fusion weights with a preset function, perform a function calculation on the first wavelet decomposition coefficients and the second wavelet decomposition coefficients to obtain the third wavelet decomposition coefficients, and perform inverse image processing with the third wavelet decomposition coefficients to generate the fused image.
Based on the above scheme, the fusion unit is specifically configured to: in a first frequency band, apply a first functional relation, combined with the fusion weights, to the first wavelet decomposition coefficients and the second wavelet decomposition coefficients to obtain the third wavelet decomposition coefficients of the first band; and in a second frequency band, apply a second functional relation, combined with the fusion weights, to the first wavelet decomposition coefficients and the second wavelet decomposition coefficients to obtain the third wavelet decomposition coefficients of the second band.
The first functional relation is different from the second functional relation; the first frequency band and the second frequency band are bands to which the human eye has different sensitivity.
A second aspect of the embodiments of the present invention provides an image generation method, including:
acquiring a black-and-white image and a color image of the same acquisition target;
performing image processing on the black-and-white image and the color image respectively, to obtain the monochrome image information of the black-and-white image and the color image information of the color image;
combining the monochrome image information and the color image information to determine the fusion weights of the black-and-white image and the color image;
fusing the monochrome image information and the color image according to the fusion weights to generate a fused image.
Based on the above scheme, combining the monochrome image information and the color image information to determine the fusion weights of the black-and-white image and the color image includes:
determining, based on the monochrome image information, the region contrast and region gradient value of the region around each pixel in the black-and-white image, and, based on the color image information, the region contrast and region gradient value of the region around each pixel in the color image;
multiplying the region contrast by the region gradient value to obtain the fusion parameter of each pixel of the black-and-white image and the color image respectively;
calculating, based on the fusion parameters, the fusion weight of each pixel of the black-and-white image and the color image.
Based on the above scheme, performing image processing on the black-and-white image and the color image respectively to obtain the monochrome image information of the black-and-white image and the color image information of the color image includes:
extracting the first luminance information of the black-and-white image;
extracting the second luminance information and the color information of the color image;
and fusing the monochrome image information and the color image according to the fusion weights to generate the fused image includes:
processing the first luminance information and the second luminance information with a preset function using the fusion weights, to obtain the third luminance information of the fused image;
combining the third luminance information with the color information to generate the fused image.
Based on the above scheme, performing image processing on the black-and-white image and the color image respectively to obtain the monochrome image information of the black-and-white image and the color image information of the color image includes:
performing wavelet decomposition on the black-and-white image and the color image respectively, to obtain the first wavelet decomposition coefficients of the luminance information of the black-and-white image and the second wavelet decomposition coefficients of the luminance information of the color image;
and fusing the monochrome image information and the color image according to the fusion weights to generate the fused image includes:
combining the fusion weights with a preset function, performing a function calculation on the first wavelet decomposition coefficients and the second wavelet decomposition coefficients to obtain the third wavelet decomposition coefficients;
performing inverse image processing with the third wavelet decomposition coefficients to generate the fused image.
Based on the above scheme, combining the fusion weights with a preset function and performing a function calculation on the first wavelet decomposition coefficients and the second wavelet decomposition coefficients to obtain the third wavelet decomposition coefficients includes:
in the first frequency band, applying the first functional relation, combined with the fusion weights, to the first wavelet decomposition coefficients and the second wavelet decomposition coefficients to obtain the third wavelet decomposition coefficients of the first band;
in the second frequency band, applying the second functional relation, combined with the fusion weights, to the first wavelet decomposition coefficients and the second wavelet decomposition coefficients to obtain the third wavelet decomposition coefficients of the second band.
The first functional relation is different from the second functional relation; the first frequency band and the second frequency band are bands to which the human eye has different sensitivity.
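As an illustration of the two-band scheme just described, the sketch below uses a weighted average in the low-frequency band and a choose-the-larger-weight rule in the high-frequency band. Both concrete rules are assumptions for illustration only; the text requires merely that the two functional relations differ.

```python
# Two different fusion rules, one per frequency band. The low-frequency
# (average) band blends the two images' coefficients by their fusion
# weights; the high-frequency (detail) band keeps the coefficient of
# whichever image carries the larger weight. Both rules are illustrative
# assumptions, not the patent's mandated functions.
def fuse_low(c_bw, c_color, w_bw, w_color):
    """First functional relation: weighted average of the coefficients."""
    return w_bw * c_bw + w_color * c_color

def fuse_high(c_bw, c_color, w_bw, w_color):
    """Second functional relation: keep the higher-weighted coefficient."""
    return c_bw if w_bw >= w_color else c_color

print(fuse_low(10.0, 20.0, 0.75, 0.25))   # -> 12.5
print(fuse_high(10.0, 20.0, 0.75, 0.25))  # -> 10.0
```

Using a smooth rule in the band where the eye is sensitive to averaging artifacts and a selective rule in the detail band is one common way to realize "different functions for different bands".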
The image generation device and method provided by the embodiments of the present invention can fuse a black-and-white image and a color image, and the fusion weight is determined from the image information of the black-and-white and color images themselves rather than being a fixed value set in advance. Because the fusion weights of different images are determined dynamically from the images themselves, the fusion effect and the picture quality of the fused image can be further improved.
Description of the drawings
Fig. 1 is a schematic flowchart of an image generation method provided by an embodiment of the present invention;
Fig. 2 is a schematic flowchart of the calculation of fusion weights provided by an embodiment of the present invention;
Fig. 3a is a schematic diagram of an original image provided by an embodiment of the present invention;
Fig. 3b is a schematic diagram of a one-level wavelet decomposition of the original image shown in Fig. 3a;
Fig. 3c is a schematic diagram of a two-level wavelet decomposition of the original image shown in Fig. 3a;
Fig. 3d is a schematic diagram of a three-level wavelet decomposition of the original image shown in Fig. 3a;
Fig. 4 is a schematic flowchart of another image generation method provided by an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of an image generation device provided by an embodiment of the present invention;
Fig. 6 is a schematic flowchart of yet another image generation method provided by an embodiment of the present invention;
Fig. 7a is an effect diagram of a black-and-white image;
Fig. 7b is an effect diagram of the color image corresponding to Fig. 7a;
Fig. 8 is a schematic diagram of the fusion effect of the images shown in Figs. 7a and 7b;
Fig. 9 is a schematic structural diagram of a mobile terminal provided by an embodiment of the present invention;
Fig. 10 is a schematic structural diagram of a communication system provided by an embodiment of the present invention.
Specific embodiment
The technical solution of the present invention is further elaborated below in conjunction with the drawings of the specification and specific embodiments.
As shown in Fig. 1, the present embodiment provides an image generation method, including:
Step S110: acquiring a black-and-white image and a color image of the same acquisition target;
Step S120: performing image processing on the black-and-white image and the color image respectively, to obtain the monochrome image information of the black-and-white image and the color image information of the color image;
Step S130: combining the monochrome image information and the color image information to determine the fusion weights of the black-and-white image and the color image;
Step S140: fusing the monochrome image information and the color image according to the fusion weights to generate a fused image.
The present embodiment provides an image generation method that generates one fused image from a black-and-white image and a color image. It can be applied in various devices with image acquisition apparatus, for example mobile terminals such as mobile phones, tablets and wearable devices, or fixed devices such as desktop computers.
In step S110, image information is extracted through image processing. For example, step S110 can obtain the following image information by parsing each image: the luminance information (Y channel) and the color information (UV channels) are cleanly separated, so that luminance and color can be processed individually. The conversion formulas between RGB and YUV images are as follows:
Y = 0.299*R + 0.587*G + 0.114*B
U = -0.147*R - 0.289*G + 0.436*B = 0.492*(B-Y)
V = 0.615*R - 0.515*G - 0.100*B = 0.877*(R-Y)
where R denotes the red color value, B denotes the blue color value, and G denotes the green color value.
Step S110 may include: extracting Y from the black-and-white image, and extracting the Y, U and V values of the color image.
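As a concrete illustration, the conversion formulas above can be applied per pixel; the following is a minimal pure-Python sketch (the function name is hypothetical):

```python
def rgb_to_yuv(r, g, b):
    """Convert one RGB pixel to YUV using the coefficients given above."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)   # equivalently -0.147*R - 0.289*G + 0.436*B
    v = 0.877 * (r - y)   # equivalently  0.615*R - 0.515*G - 0.100*B
    return y, u, v

# Pure white carries full luminance and no chroma.
print(tuple(round(c, 3) for c in rgb_to_yuv(255, 255, 255)))  # -> (255.0, 0.0, 0.0)
```

Because the Y coefficients sum to 1, a gray pixel (R = G = B) always maps to U = V = 0, which is exactly the separation of luminance from color that the fusion pipeline relies on.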
In step S120, image processing is performed on the black-and-white image and the color image respectively, for example extracting the pixel information of each pixel in the image, such as the pixel luminance and the gray value of each pixel. In short, in the present embodiment the image information of the black-and-white image is called the monochrome image information, and the image information of the color image is called the color image information.
In step S130, the fusion weights of the black-and-white and color images can be determined dynamically by combining the monochrome image information and the color image information. The present embodiment rejects performing image fusion directly with static, preset fusion weights; instead, fusion weights suitable for the current black-and-white and color images are selected according to the image information of the different color and black-and-white images, thereby improving the fusion effect of the fused image.
In step S140, the black-and-white image and the color image are fused according to the dynamically determined fusion weights, and the fusion can be carried out pixel by pixel. Based on the extracted monochrome image information and color image information, the present embodiment thus obtains a fused image that combines the advantages of the black-and-white image and the color image, and whose overall picture quality is higher than that of either the black-and-white image or the color image alone. In this way, the hardware requirements of the image acquisition structure can be reduced, lowering hardware cost.
For example, current mobile platforms have evolved from a single lens to dual or even multiple lenses, and different lenses have different advantages and disadvantages. A traditional color lens is mostly a three-primary-color (RGB) Bayer sensor, in which each color channel filters out light of the other colors. A black-and-white (MONO) lens records no color information and has no color separation filter; it is a fully transmissive sensor, so all light passes directly to the sensor and is captured, and its light intake is larger than that of a common color sensor. It can effectively improve pixel detail, giving clearer imaging and richer detail, but it loses the color information. If the color and the detail information of an image scene are both wanted at the same time, the images of the two cameras need to be fused. The present embodiment combines image processing techniques such as wavelet decomposition with a sharpness decision method based on local gradients to fuse the black-and-white and color images, which can effectively retain both the color and the detail information of the image.
In some embodiments, as shown in Fig. 2, step S130 may include:
Step S131: determining, based on the monochrome image information, the region contrast and region gradient value of the region around each pixel in the black-and-white image, and, based on the color image information, the region contrast and region gradient value of the region around each pixel in the color image;
Step S132: multiplying the region contrast by the region gradient value to obtain the fusion parameter of each pixel of the black-and-white image and the color image respectively;
Step S133: calculating, based on the fusion parameters, the fusion weight of each pixel of the black-and-white image and the color image.
In the present embodiment, the region contrast and the region gradient value of the regions of the black-and-white image and the color image can be extracted. A pixel region can be a rectangular image region centered on the pixel, for example a region of N*M pixels in which the pixel lies at the center.
In some embodiments, step S131 can calculate the region contrast with the following formula, where: i and j are the coordinates of pixel p; CL_{i,j} is the region contrast of the region around pixel p; P(i,j) is the first image information value of pixel p; m(i,j) is the mean of the first image information values of the pixels in the region around p; N is the number of pixels of the region in the first dimension and M the number in the second dimension, the first dimension being perpendicular to the second; N*M is the total number of pixels of the region, and p is generally the center pixel of the region.
When image processing is performed on the black-and-white image, the first image information value is the monochrome image information; when the color image is processed, the first image information value is the color image information value. The same applies to the region gradient calculation below.
With the above formula, the region contrast can be calculated easily in the present embodiment. Further, step S131 can calculate the region gradient value with the following formula, where: i and j are the coordinates of pixel p; I is the image information matrix of the region around pixel p in the image, composed of the first image information values of the pixels in that region; the convolution of I with A1 gives the gradient value of the region in the x direction, and the convolution of I with A2 gives the gradient value in the y direction; GL_{i,j} is the region gradient value of the region around pixel p; A1 is the first convolution operator and A2 is the second convolution operator.
A1 and A2 are operators for performing the convolution operation, and each corresponds to a matrix. The size of the operator matrix determines the extent of the convolution window along the horizontal and vertical coordinates; a 3×3 convolution operator pair can be used. In the present embodiment the convolution operator can also be 5×5 or 7×7. Note that when the convolution is computed, I is a matrix with the same numbers of rows and columns as the convolution operator; for example, if the convolution operator is a 3×3 matrix, then I is also a 3×3 matrix.
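As an illustration of the region gradient calculation, the sketch below convolves a 3×3 neighborhood with an operator pair and combines the two directional gradients into a magnitude. The specific A1/A2 matrices do not survive in this text, so the standard Sobel pair is assumed here, and the root-sum-of-squares magnitude is likewise only one common choice:

```python
# Gradient magnitude of a 3x3 neighborhood via two convolution operators.
# The patent's A1/A2 matrices are not reproduced in this text; the Sobel
# pair below is a standard assumption, not necessarily the patent's choice.
A1 = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # x-direction operator
A2 = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # y-direction operator

def conv3(I, A):
    """Element-wise multiply a 3x3 block I with operator A and sum."""
    return sum(I[r][c] * A[r][c] for r in range(3) for c in range(3))

def region_gradient(I):
    gx = conv3(I, A1)                    # gradient in the x direction
    gy = conv3(I, A2)                    # gradient in the y direction
    return (gx * gx + gy * gy) ** 0.5    # one common magnitude definition

# A vertical edge (bright right column) gives a pure x-gradient.
block = [[0, 0, 9], [0, 0, 9], [0, 0, 9]]
print(region_gradient(block))  # -> 36.0
```

A flat block yields a gradient of zero, so the fusion parameter (contrast × gradient) is small exactly where an image carries no detail worth favoring.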
In some embodiments, step S133 may include calculating the fusion weight with the following formula, where: WM(i,j,s) is the fusion parameter WM(i,j) of the pixel with coordinates (i,j) in the s-th image; WM(i,j,k) is the fusion parameter WM(i,j) of the pixel with coordinates (i,j) in the k-th image; and W(i,j,s) is the fusion weight of the pixel with coordinates (i,j) in the s-th image. If the present embodiment includes only one black-and-white image and one color image, then K = 2, the 1st image being the black-and-white image and the 2nd image being the color image.
The sum over k of WM(i,j,k) is the sum of the fusion parameters of the pixel with coordinates (i,j) over all the images; W(i,j,s) is the share of that pixel's fusion parameter in this sum, i.e. W(i,j,s) = WM(i,j,s) / Σk WM(i,j,k), which is exactly the fusion weight.
Obviously, calculating the fusion weights with the above method is easy to implement. In a concrete implementation, an adjustment factor may be introduced according to the image fusion characteristics, for example to make the fused image emphasize the features of a particular image: the adjustment factor of that image is set larger than the adjustment factors of the other images. The adjustment factor can be an additive factor or a multiplicative factor. After an additive factor a is introduced, the fusion weight can be calculated with a correspondingly modified formula; likewise after a multiplicative factor b is introduced. In short, there is more than one method of calculating the fusion weights.
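The weight calculation of steps S131-S133 can be sketched at a single pixel as follows; the equal-weight fallback for a region that is flat in every image is an added assumption, not part of the source:

```python
# Fusion parameter and per-pixel weight for K source images, following
# steps S131-S133: WM = region_contrast * region_gradient, and each
# weight is that pixel's WM as a share of the sum over all images.
def fusion_weights(contrasts, gradients):
    """contrasts[k], gradients[k]: values at one pixel for image k."""
    wm = [c * g for c, g in zip(contrasts, gradients)]
    total = sum(wm)
    if total == 0:  # flat region in every image: fall back to equal weights
        return [1.0 / len(wm)] * len(wm)
    return [w / total for w in wm]

# K = 2: image 1 is the black-and-white image, image 2 the color image.
w = fusion_weights(contrasts=[4.0, 1.0], gradients=[3.0, 2.0])
print(w)  # -> [0.8571428571428571, 0.14285714285714285]
```

By construction the weights at each pixel sum to 1, so the subsequent weighted combination of coefficients preserves the overall signal level.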
In some embodiments, step S140 may include calculating the fused image information value with the following formula: F(i,j) = Σk W(i,j,k) · wavek(i,j), where F(i,j) is the fused wavelet decomposition coefficient of the pixel with coordinates (i,j); wavek(i,j) is the first wavelet decomposition coefficient of the pixel with coordinates (i,j) in image k; and W(i,j,k) is the fusion weight of the pixel with coordinates (i,j) in image k. Of course, only one functional relation for performing image fusion based on the fusion weights is given here; a concrete implementation is not limited to the above formula.
In some embodiments, step S120 may include: extracting the first luminance information of the black-and-white image; and extracting the second luminance information and the color information of the color image.
Step S140 may include: processing the first luminance information and the second luminance information with a preset function using the fusion weights, to obtain the third luminance information of the fused image; and combining the third luminance information with the color information to generate the fused image.
In some embodiments, step S120 may also include: performing wavelet decomposition on the black-and-white image and the color image respectively, to obtain the first wavelet decomposition coefficients of the luminance information of the black-and-white image and the second wavelet decomposition coefficients of the luminance information of the color image.
Step S140 may include:
combining the fusion weights with a preset function, performing a function calculation on the first wavelet decomposition coefficients and the second wavelet decomposition coefficients to obtain the third wavelet decomposition coefficients;
performing inverse image processing with the third wavelet decomposition coefficients to generate the fused image.
In this embodiment, wavelet decomposition may be performed on the above black-and-white image and color image to obtain wavelet decomposition coefficients. The wavelet decomposition coefficients of the black-and-white image may be called the first wavelet decomposition coefficients, and those of the color image the second wavelet decomposition coefficients. Here the wavelet decomposition coefficients are a constituent part of the aforementioned black-and-white image information and color image information. Using the first and second wavelet decomposition coefficients, the third wavelet decomposition coefficients for generating the fused image can be obtained; finally, the fused image can be obtained from the third wavelet decomposition coefficients by an inverse image transform. Wavelet decomposition extracts image information of different dimensions from an image, for example luminance information and color information; the specific values of the luminance and color information can be represented by wavelet decomposition coefficients. An inherent property of wavelet decomposition is that the decomposition process involves no information loss and no redundant information; it can decompose a picture into a combination of an average image and detail images that represent the different structures of the image, so the structural information and detail information of the original image are easy to extract. In addition, wavelet decomposition has fast algorithms, and two-dimensional wavelet decomposition generally provides images with a directional selectivity that matches the human visual system.
Fig. 3 a is original image;Fig. 3 b is 1 wavelet decomposition image;Fig. 3 c be in 1 wavelet decomposition image small echo again
Decompose the 2 wavelet decomposition images for obtaining;Fig. 3 d is to decompose 3 wavelet decomposition for obtaining on 2 wavelet decomposition images again
Image.
Of course, in specific implementations, the processing of the images is not limited to wavelet decomposition; the first image information values may also be obtained directly by other means such as luminance extraction and color extraction, for example obtaining image information values such as the luminance value, color value, and gray value of each pixel.
Further, performing the function calculation on the first wavelet decomposition coefficients and the second wavelet decomposition coefficients in combination with the fusion weights and the preset function, to obtain the third wavelet decomposition coefficients, includes:
in a first frequency band, applying a first functional relation to the first wavelet decomposition coefficients and the second wavelet decomposition coefficients in combination with the fusion weights, to obtain the third wavelet decomposition coefficients of the first frequency band;
in a second frequency band, applying a second functional relation to the first wavelet decomposition coefficients and the second wavelet decomposition coefficients in combination with the fusion weights, to obtain the third wavelet decomposition coefficients of the second frequency band;
where the first functional relation is different from the second functional relation, and the first frequency band and the second frequency band are frequency bands to which the human eye has different sensitivities.
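The patent does not spell out the two functional relations, so the following is only a plausible sketch of what two band-dependent rules can look like: a per-pixel weighted average for the low-frequency (approximation) band, and a keep-the-larger-magnitude rule for the high-frequency (detail) bands. Both rule names are common choices in wavelet fusion literature, not the patent's own definitions.

```python
import numpy as np

def fuse_low(c_mono, c_color, w_mono, w_color):
    # Low-frequency band: weighted average of the two approximation
    # coefficient arrays, using the fusion weights.
    return w_mono * c_mono + w_color * c_color

def fuse_high(c_mono, c_color):
    # High-frequency band: keep whichever coefficient has the larger
    # magnitude, i.e. the sharper detail (one common rule; the patent
    # does not fix the high-frequency relation).
    return np.where(np.abs(c_mono) >= np.abs(c_color), c_mono, c_color)
```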
Fig. 4 is a schematic flowchart of performing image fusion on different frequency bands with different fusion rules, using the image generation method provided by this embodiment.
Source image A and source image B may be any two of the aforementioned K images. Wavelet decomposition is performed on each, yielding high-frequency subband coefficients and low-frequency subband coefficients; both are the wavelet decomposition coefficients described above. The high-frequency subband coefficients of the two source images are then combined using a high-frequency fusion rule and high-frequency fusion weights to obtain the high-frequency subband coefficients of the fused image; the low-frequency subbands of the two source images are combined using a low-frequency fusion rule and low-frequency fusion weights to obtain the low-frequency subband coefficients. The fused image is finally obtained by the inverse transform. Here a high-frequency subband may be one whose frequency is above a certain frequency threshold, and a low-frequency subband one whose frequency is below that threshold.
As shown in Fig. 5, this embodiment provides an image generation device, including:
a first acquisition unit 310, configured to obtain a black-and-white image and a color image of the same acquisition target;
a second acquisition unit 320, configured to perform image processing on the black-and-white image and the color image respectively, to obtain black-and-white image information of the black-and-white image and color image information of the color image;
a determining unit 330, configured to determine fusion weights of the black-and-white image and the color image by combining the black-and-white image information and the color image information;
a fusion unit 340, configured to fuse the black-and-white image and the color image according to the fusion weights, to generate a fused image.
The image generation device provided by this embodiment of the present invention may be a structure within various electronic devices; such electronic devices may include various mobile terminals such as mobile phones, tablet computers, and wearable devices, and may also be various non-movable fixed terminals.
The first acquisition unit 310, second acquisition unit 320, determining unit 330, and fusion unit 340 in this embodiment may correspond to a processor or processing circuit in the electronic device. The processor may include a central processing unit (CPU), a microcontroller (MCU), a digital signal processor (DSP), a programmable logic controller (PLC), an application processor (AP), or the like. The processing circuit may include an application-specific integrated circuit (ASIC).
The processor or processing circuit can perform the above functions by executing predetermined instructions. The second acquisition unit 320, determining unit 330, fusion unit 340, and so on may also correspond to a computer or a processor with a computing function, and can calculate the information required by each unit through configured functions, e.g., the region contrast or the region gradient value.
The image generation device of this embodiment, by obtaining a black-and-white image and a color image that include the same acquisition target, can generate a fused image of the acquisition target with higher image quality, thereby lowering the requirements on the acquisition hardware, reducing hardware cost, and improving image quality.
In some embodiments, the determining unit 330 is specifically configured to: determine, based on the black-and-white image information, the region contrast and region gradient value of the region where each pixel of the black-and-white image is located, and determine, based on the color image information, the region contrast and region gradient value of the region where each pixel of the color image is located; multiply the region contrast by the region gradient value to obtain the fusion parameter of each pixel in the black-and-white image and in the color image respectively; and calculate, based on the fusion parameters, the fusion weight of each pixel of the black-and-white image and the color image respectively.
The determining unit 330 in this embodiment may correspond to a computer or a processor with a computing function; using the aforementioned functional relations, it calculates the region contrast and the region gradient value, and from these conveniently obtains the fusion values.
In some embodiments, the second acquisition unit 320 is specifically configured to extract first luminance information of the black-and-white image, and to extract second luminance information and color information of the color image. The fusion unit 340 is specifically configured to process the first luminance information and the second luminance information with a preset function using the fusion weights, to obtain third luminance information of the fused image, and to generate the fused image by combining the third luminance information with the color information.
This embodiment fuses a black-and-white image with a color image, so only the luminance information of the black-and-white image needs to be extracted, while both luminance information and color information can be extracted from the color image. The resulting fused image has the luminance and contrast of the black-and-white image while also having the color of the color image; it clearly combines the advantages of both, and its image quality is higher than that of either the black-and-white image or the color image alone.
In some embodiments, the second acquisition unit 320 is specifically configured to perform wavelet decomposition on the black-and-white image and the color image respectively, to obtain first wavelet decomposition coefficients of the luminance information of the black-and-white image and second wavelet decomposition coefficients of the luminance information of the color image;
the fusion unit 340 is specifically configured to perform a function calculation on the first wavelet decomposition coefficients and the second wavelet decomposition coefficients in combination with the fusion weights and a preset function, to obtain third wavelet decomposition coefficients, and to perform inverse image processing using the third wavelet decomposition coefficients to generate the fused image.
In this embodiment, the black-and-white image and the color image are processed using wavelet decomposition, obtaining wavelet decomposition coefficients that characterize the image information of the black-and-white image and of the color image. In fusion, the fusion unit 340 therefore fuses the wavelet decomposition coefficients and finally obtains the fused image by an inverse image transform of the coefficients. Wavelet decomposition loses little image information and introduces little redundancy, so images obtained through such processing have little distortion, and the processing has the advantages of a small amount of calculation and low storage-resource usage.
In some embodiments, the fusion unit 340 is specifically configured to: in a first frequency band, apply a first functional relation to the first wavelet decomposition coefficients and the second wavelet decomposition coefficients in combination with the fusion weights, to obtain the third wavelet decomposition coefficients of the first frequency band; and in a second frequency band, apply a second functional relation to the first wavelet decomposition coefficients and the second wavelet decomposition coefficients in combination with the fusion weights, to obtain the third wavelet decomposition coefficients of the second frequency band. The first functional relation is different from the second functional relation, and the first and second frequency bands are frequency bands to which the human eye has different sensitivities.
In this embodiment, performing image fusion with different fusion functions in different frequency bands can, on the one hand, guarantee the quality of the fused image and, on the other hand, reduce the processing as much as possible.
Several specific examples are provided below based on any of the above embodiments.
As shown in Fig. 6, this embodiment provides an image generation method, including:
S100: Obtain a color image and a black-and-white image, and preprocess them. For example, a black-and-white lens collects the black-and-white image and a color lens collects the color image; preferably, a binocular device with both a black-and-white lens and a color lens performs image acquisition of the same acquisition target at the same time. The acquisition target here may include people, scenery, animals, and various other objects available for image acquisition.
S200: Perform wavelet decomposition on the color image and the black-and-white image respectively.
S300: Calculate low-frequency fusion weights and high-frequency fusion weights.
S400: Fuse the color image and the black-and-white image according to the fusion weights, to obtain the fused image.
In step S100, since the black-and-white image has only luminance information and no color information, black-and-white and color image fusion needs to fuse the luminance information, while the color of the fused image comes entirely from the color image. It is therefore necessary to convert the color RGB image into a YUV image: the YUV format cleanly separates the luminance information (Y channel) of the image from the color (UV channels), so that luminance and color information can be processed separately.
In step S200, the Y-channel image of the color image and the black-and-white image are each decomposed into high-frequency and low-frequency parts using wavelet decomposition. After the wavelet decomposition of S200, step S300 has the wavelet decomposition coefficients of the luminance component Y of the color image and of the luminance component Y of the black-and-white image.
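The RGB-to-YUV split in step S100 can be sketched as follows. The patent gives only the inverse (YUV-to-RGB) formulas later; the forward coefficients below are the standard BT.601-style ones that are consistent with those inverse formulas (R = Y + 1.140V, G = Y − 0.394U − 0.581V, B = Y + 2.032U), so treat them as an inferred, not quoted, choice.

```python
import numpy as np

def rgb_to_yuv(rgb):
    """Split an RGB image (H x W x 3, float in [0, 1]) into Y, U, V planes.

    Y carries the luminance to be fused; U, V carry the color, which the
    method takes entirely from the color image.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance (Y channel)
    u = 0.492 * (b - y)                     # blue-difference chroma
    v = 0.877 * (r - y)                     # red-difference chroma
    return y, u, v
```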
Fig. 7 a is the display renderings of black white image, and Fig. 7 b is coloured image design sketch.By observation it is found that black and white
Image is preferable in the contrast of round frame intra-zone, and image detail is clear, can clearly see the English word of the inside clearly, and color
The details of color image is then relatively fuzzyyer, it is impossible to differentiate the English word inside red frame.Fusion image is exactly to need to retain in scene
Clear detailed information.In view of this, as shown in figure 8, after wavelet decomposition, respectively to high-frequency sub-band and low frequency sub-band at
Reason, need by these relatively clearly the coefficient of details remain.In the diagram, source images A and source images B can be aforementioned black
White image and coloured image, carry out wavelet decomposition respectively, obtain high-frequency sub-band coefficient and low frequency sub-band coefficient respectively.Here height
Frequency sub-band coefficients and low frequency sub-band coefficient are all coefficient of wavelet decomposition above.So latter two source images by high-frequency sub-band coefficient
Using high frequency fusion rule, and high frequency blending weight obtains the high-frequency sub-band coefficient of fusion image;Low frequency by two source images
Subband obtains low frequency sub-band coefficient using low frequency sub-band rule and the fusion of low frequency blending weight.Then finally given by inversion
Fusion image.Here high-frequency sub-band can be more than the frequency of some frequency threshold for frequency, and low frequency sub-band can be less than for frequency
The frequency of frequency threshold.Here high frequency blending weight and low frequency blending weight are one kind of foregoing fusion weights.
Therefore, this example computes local-region contrast and global gradient image features on the decomposed wavelet coefficients to generate the black-and-white and color image fusion weights. The calculation process is as follows:

WM(i, j) = CL_{i,j} * GL_{i,j}    (1)

where i, j are the coordinates of any pixel point p in the image, WM(i, j) is the initial weight with which that pixel participates in the fusion algorithm, CL_{i,j} is the local-region contrast of the pixel, and GL_{i,j} is the gradient magnitude of the pixel.
In formula (2), p(i, j) is the pixel value of the pixel and m(i, j) is the local-region mean. Formula (3) uses the Sobel operator to compute the image gradient magnitude in the horizontal and vertical directions. The operator comprises two 3x3 matrices, one horizontal and one vertical; convolving each with the image in the plane yields approximations of the luminance differences in the horizontal and vertical directions respectively. If I denotes the original image and Gx and Gy denote the images after vertical and horizontal edge detection respectively, then the gradient magnitude G at each image pixel is given by:

G = sqrt(Gx^2 + Gy^2)

According to the above calculation process, the fusion weight maps of the black-and-white and color images can each be computed. In formula (4), WM(i, j, k) is the fusion weight of each pixel in the k-th image; the fusion coefficient weights of the different source images are thus normalized so that

sum over k of WM(i, j, k) = 1,

which ensures that after image fusion the pixels do not exceed the original value range. The wavelet decomposition coefficients of the decomposed images can then be fused according to formula (5); the high-frequency and low-frequency coefficients are fused in the same way, each being multiplied by its fusion weight coefficient.
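The per-pixel normalization of formula (4) can be sketched as follows. The fallback to uniform weights where all raw weights are zero (e.g., regions flat in every source image) is an added safeguard, not something the patent specifies.

```python
import numpy as np

def normalize_weights(wms, eps=1e-12):
    """Normalize K weight maps WM_k(i, j) so they sum to 1 at every pixel.

    wms: list of K same-shaped arrays of raw fusion weights.
    Where the raw weights sum to ~0, fall back to uniform 1/K weights
    (an assumed safeguard) so the normalized maps still sum to 1.
    """
    stack = np.stack(wms, axis=0).astype(float)
    total = stack.sum(axis=0)
    safe_total = np.where(total > eps, total, 1.0)
    uniform = np.full_like(stack, 1.0 / len(wms))
    norm = np.where(total > eps, stack / safe_total, uniform)
    return list(norm)
```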
From the above formulas it can be seen that the larger the region contrast and the gradient feature, the more prominent the region characteristics of the pixel and the clearer the image detail; these are the image pixels that need to be retained in the black-and-white and color fused image, so their fusion weights are correspondingly larger.
S400 fuses the wavelet decomposition coefficients of the Y channels of the color image and the black-and-white image according to the fusion weights of S300, obtaining the Y-channel image of the fused image. Since the black-and-white image has no color information (UV channels), the UV channels of the color image are assigned to the fused image, after which the fused image is converted from YUV to an RGB image with the following conversion formulas:

R = Y + 1.140*V
G = Y - 0.394*U - 0.581*V
B = Y + 2.032*U

Fig. 8 shows the fused image effect, which is clearly better than the image effects of both Fig. 7a and Fig. 7b.
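The final conversion step maps the fused Y channel, together with the color image's U and V channels, back to RGB using exactly the formulas above:

```python
import numpy as np

def yuv_to_rgb(y, u, v):
    """Convert the fused Y channel plus the color image's U, V channels
    back to an RGB image, using the conversion formulas given above."""
    r = y + 1.140 * v
    g = y - 0.394 * u - 0.581 * v
    b = y + 2.032 * u
    return np.stack([r, g, b], axis=-1)   # H x W x 3 RGB image
```

With U = V = 0 (no chroma) this reproduces the fused luminance as a gray image, which is the expected degenerate case.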
A mobile terminal that may include the aforementioned image generation device is presented below. In the following description, suffixes such as "module", "part", or "unit" used to denote elements are used only to facilitate the explanation of the present invention and have no specific meaning by themselves; "module" and "part" may therefore be used interchangeably.
Mobile terminals may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, personal digital assistants (PDA), tablet computers (PAD), portable multimedia players (PMP), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following it is assumed that the terminal is a mobile terminal; however, those skilled in the art will understand that, apart from elements specifically intended for mobile purposes, constructions according to embodiments of the present invention can also be applied to fixed-type terminals.
Fig. 9 illustrates the hardware configuration of a mobile terminal 100 for realizing the embodiments of the present invention. As shown in Fig. 9, the mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 9 shows a mobile terminal 100 with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal 100 are described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and sends broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and sends them to a terminal. The broadcast signals may include TV broadcast signals, radio broadcast signals, data broadcast signals, and the like, and may further include broadcast signals combined with TV or radio broadcast signals. Broadcast-related information may also be provided via a mobile communication network, in which case it can be received by the mobile communication module 112. Broadcast signals may exist in various forms; for example, they may exist in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB), an electronic service guide (ESG) of digital video broadcasting-handheld (DVB-H), and so on. The broadcast receiving module 111 can receive signal broadcasts using various types of broadcast systems; in particular, it can receive digital broadcasts using digital broadcast systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the MediaFLO forward-link-media data broadcast system, and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 may be constructed to be suitable for the various broadcast systems that provide broadcast signals as well as the above digital broadcast systems. Broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 sends radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data sent and/or received according to text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access of the mobile terminal 100 and may be internally or externally coupled to the terminal. The wireless Internet access technologies involved in the wireless Internet module 113 may include wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), wireless broadband (WiBro), worldwide interoperability for microwave access (WiMAX), high-speed downlink packet access (HSDPA), and so on.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth(TM), radio frequency identification (RFID), Infrared Data Association (IrDA), ultra wideband (UWB), ZigBee(TM), and so on.
The location information module 115 is a module for checking or obtaining location information of the mobile terminal 100. A typical example of the location information module 115 is the global positioning system (GPS) module 115. With current technology, the GPS module 115 calculates range information from three or more satellites together with accurate time information and applies triangulation to the calculated information, so as to accurately calculate three-dimensional current location information in terms of longitude, latitude, and altitude. Currently, the method for calculating position and time information uses three satellites and corrects the errors of the calculated position and time information with one further satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating the current location in real time.
The A/V input unit 120 is used to receive audio or video signals and may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode; the processed image frames may be displayed on a display unit 151. Image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or sent via the wireless communication unit 110, and two or more cameras 121 may be provided according to the construction of the mobile terminal 100. The microphone 122 can receive sound (audio data) via a microphone in operating modes such as a telephone call mode, a recording mode, or a speech recognition mode, and can process such sound into audio data. In the telephone call mode, the processed audio (voice) data may be converted into a format that can be sent to a mobile communication base station via the mobile communication module 112 and output. The microphone 122 may implement various types of noise elimination (or suppression) algorithms to eliminate (or suppress) noise or interference produced while receiving and sending audio signals.
The user input unit 130 can generate key input data according to commands input by the user, to control various operations of the mobile terminal 100. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, and so on caused by touch), a jog wheel, a jog switch, and so on. In particular, when the touch pad is superimposed on the display unit 151 as a layer, a touch screen may be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (e.g., the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch input), the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power and whether the interface unit 170 is coupled with an external device.
The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external devices may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port (typically a universal serial bus (USB) port), a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for verifying a user of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and so on. In addition, a device having an identification module (hereinafter referred to as an "identifying device") may take the form of a smart card; the identifying device can therefore be connected with the mobile terminal 100 via a port or other connection means.
The interface unit 170 can be used to receive input (e.g., data information, power, etc.) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or can be used to transfer data between the mobile terminal 100 and an external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 can serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal 100. The various command signals or power input from the cradle may serve as signals for recognizing whether the mobile terminal 100 is accurately mounted on the cradle.
The output unit 150 is configured to provide output signals in a visual, audio, and/or tactile manner (e.g., audio signals, video signals, alarm signals, vibration signals, etc.). The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a telephone call mode, the display unit 151 can display a user interface (UI) or graphical user interface (GUI) related to the call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 can display captured and/or received images, a UI or GUI showing video or images and related functions, and so on.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other as a layer to form a touch screen, the display unit 151 can serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and so on. Some of these displays may be constructed to be transparent to allow viewing from the outside; these may be called transparent displays, a typical example being a TOLED (transparent organic light-emitting diode) display. Depending on the particular desired embodiment, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal 100 may include an external display unit (not shown) and an internal display unit (not shown). The touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 can, when the mobile terminal 100 is in a mode such as a call signal receiving mode, a call mode, a recording mode, a speech recognition mode, or a broadcast receiving mode, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 can provide audio output related to a specific function executed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and so on.
The alarm unit 153 can provide output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 can provide output in different ways to notify the occurrence of an event. For example, the alarm unit 153 can provide output in the form of vibration; when a call, a message, or some other incoming communication is received, the alarm unit 153 can provide tactile output (i.e., vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 can store software programs for the processing and control operations executed by the controller 180, and can temporarily store data that has been output or is to be output (e.g., a phone book, messages, still images, video, etc.). Moreover, the memory 160 can store data on the vibration and audio signals of various modes that are output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including flash memory, hard disk, multimedia card, card-type memory (e.g., SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, optical disc, and so on. Moreover, the mobile terminal 100 can cooperate, over a network connection, with a network storage device that performs the storage function of the memory 160.
The controller 180 typically controls the overall operation of the mobile terminal 100. For example, the controller 180 performs the control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 180 may include a multimedia module 181 for reproducing or playing back multimedia data; the multimedia module 181 may be constructed within the controller 180, or may be constructed separately from the controller 180. The controller 180 may also perform pattern recognition processing to recognize handwriting input or picture drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external or internal power under the control of the controller 180 and supplies the appropriate power required for operating the respective elements and components.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180. For a software implementation, embodiments such as procedures or functions may be implemented with separate software modules that allow at least one function or operation to be performed. Software code can be implemented by a software application (or program) written in any suitable programming language; the software code may be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal 100 has been described in terms of its functions. Hereinafter, for the sake of brevity, a slide-type mobile terminal 100, among various types of mobile terminals 100 such as folder-type, bar-type, swing-type, and slide-type mobile terminals 100, will be described as an example. Accordingly, the present invention can be applied to any type of mobile terminal 100, and is not limited to the slide-type mobile terminal 100.
The mobile terminal 100 as shown in Figure 9 may be configured to operate with communication systems that transmit data via frames or packets, such as wired and wireless communication systems and satellite-based communication systems.
A communication system in which the mobile terminal 100 according to the present invention is operable will now be described with reference to Figure 10.
Such communication systems may use different air interfaces and/or physical layers. For example, air interfaces used by communication systems include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), the Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), the Global System for Mobile Communications (GSM), and the like. As a non-limiting example, the following description relates to a CDMA communication system, but such teachings apply equally to other types of systems.
Referring to Figure 10, a CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to interface with a public switched telephone network (PSTN) 290. The MSC 280 is also configured to interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul links. The backhaul links may be constructed in accordance with any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be appreciated that the system as shown in Figure 10 may include a plurality of BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector being covered by an omnidirectional antenna or an antenna pointed in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be configured to support a plurality of frequency assignments, with each frequency assignment having a particular spectrum (for example, 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BS 270 may also be referred to as a base transceiver subsystem (BTS) or by other equivalent terms. In such a case, the term "base station" may be used to refer broadly to a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site". Alternatively, the individual sectors of a particular BS 270 may be referred to as a plurality of cell sites.
As shown in Figure 10, a broadcasting transmitter (BT) 295 transmits a broadcast signal to the mobile terminals 100 operating within the system. The broadcast receiving module 111 as shown in Figure 9 is provided at the mobile terminal 100 to receive the broadcast signals transmitted by the BT 295. In Figure 10, several satellites 300 are shown; for example, Global Positioning System (GPS) satellites 300 may be employed. The satellites 300 help locate at least one of the plurality of mobile terminals 100.
In Figure 10, a plurality of satellites 300 are depicted, but it is understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 as shown in Figure 9 is typically configured to cooperate with the satellites 300 to obtain the desired positioning information. Instead of, or in addition to, GPS tracking techniques, other technologies that can track the position of the mobile terminal 100 may be used. In addition, at least one GPS satellite 300 may optionally or additionally handle satellite DMB transmissions.
As one typical operation of the wireless communication system, the BS 270 receives reverse-link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging, and other types of communications. Each reverse-link signal received by a particular base station 270 is processed within that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC 275 provides call resource allocation and mobility management functions, including the coordination of soft handoff procedures between the BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN 290 interfaces with the MSC 280, the MSC 280 interfaces with the BSCs 275, and the BSCs 275 in turn control the BSs 270 to transmit forward-link signals to the mobile terminals 100.
In the mobile terminal, the mobile communication module 112 of the wireless communication unit 110 accesses a mobile communication network (such as a 2G/3G/4G mobile communication network) based on the mobile terminal's built-in data necessary for accessing the mobile communication network (including user identification information and authentication information), and transmits mobile communication data (including uplink and downlink mobile communication data) for services of the mobile terminal user such as web browsing and network multimedia playback.
The wireless Internet module 113 of the wireless communication unit 110 implements the function of a wireless hotspot by running the protocols related to wireless hotspots. The hotspot supports access by a plurality of mobile terminals (any mobile terminals other than this one); by multiplexing the mobile communication connection between the mobile communication module 112 and the mobile communication network, it transmits mobile communication data (including uplink and downlink mobile communication data) for services of the mobile terminal users such as web browsing and network multimedia playback. Since the mobile terminal essentially transmits mobile communication data by multiplexing the mobile communication connection between itself and the communication network, the traffic of the mobile communication data consumed by the mobile terminal is counted against the mobile terminal's communication tariff by the charging entity on the communication network side, thereby consuming the mobile communication data traffic included in the tariff to which the mobile terminal has subscribed.
In the several embodiments provided herein, it should be understood that the disclosed apparatus and methods may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is only a division by logical function, and there may be other division manners in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings, or communication connections between the components shown or discussed may be indirect couplings or communication connections of devices or units through some interfaces, and may be electrical, mechanical, or in other forms.
The units described above as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the various embodiments of the present invention may all be integrated into one processing module, or each unit may separately serve as one unit, or two or more units may be integrated into one unit; the above integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
A person of ordinary skill in the art will appreciate that all or part of the steps for implementing the above method embodiments may be completed by hardware related to program instructions. The aforementioned program may be stored in a computer-readable storage medium; when the program is executed, the steps of the above method embodiments are performed. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any person familiar with the technical field can readily conceive of changes or substitutions within the technical scope disclosed by the present invention, and these should all be covered within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be defined by the scope of the claims.
Claims (10)
1. An image generation device, characterized by comprising:
a first acquisition unit, configured to acquire a monochrome image and a color image of a same acquisition target;
a second acquisition unit, configured to perform image processing on the monochrome image and the color image respectively, to obtain monochrome image information of the monochrome image and color image information of the color image;
a determining unit, configured to determine, in combination with the monochrome image information and the color image information, fusion weights for the monochrome image and the color image;
a fusion unit, configured to fuse the monochrome image information and the color image according to the fusion weights, to generate a fused image.
2. The device according to claim 1, characterized in that:
the determining unit is specifically configured to: determine, based on the monochrome image information, a region contrast and a region gradient value of the region around each pixel in the monochrome image, and, based on the color image information, a region contrast and a region gradient value of the region around each pixel in the color image; multiply the region contrast by the region gradient value to obtain fusion parameters for each pixel in the monochrome image and in the color image respectively; and calculate, based on the fusion parameters, fusion weights for each pixel of the monochrome image and of the color image respectively.
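The weight computation of claim 2 can be illustrated with a short sketch. This is an illustrative reconstruction, not the patented implementation: the window size, the choice of max-minus-min as region contrast, and the sum-to-one normalization of the weights are all assumptions.

```python
import numpy as np

def fusion_weights(mono, color_luma, win=3):
    """Per-pixel fusion weights from region contrast x region gradient value.

    mono, color_luma: 2-D float arrays (luminance planes of the monochrome
    and color images). Returns (w_mono, w_color), summing to 1 per pixel.
    """
    def fusion_param(img):
        pad = win // 2
        p = np.pad(img, pad, mode="edge")
        h, w = img.shape
        contrast = np.empty_like(img, dtype=float)
        for i in range(h):
            for j in range(w):
                region = p[i:i + win, j:j + win]
                contrast[i, j] = region.max() - region.min()  # region contrast
        gy, gx = np.gradient(img)
        grad = np.hypot(gx, gy)                               # region gradient value
        return contrast * grad                                # fusion parameter

    pm, pc = fusion_param(mono), fusion_param(color_luma)
    total = pm + pc + 1e-12        # avoid division by zero in flat areas
    return pm / total, pc / total
```

Normalizing the two fusion parameters so the weights sum to one per pixel is one plausible reading of "calculate the fusion weights based on the fusion parameters"; it guarantees the weighted blend stays within the dynamic range of the inputs.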
3. The device according to claim 1, characterized in that:
the second acquisition unit is specifically configured to extract first luminance information of the monochrome image, and to extract second luminance information and color information of the color image;
the fusion unit is specifically configured to process the first luminance information and the second luminance information with a preset function using the fusion weights, to obtain third luminance information of the fused image, and to generate the fused image by combining the third luminance information and the color information.
4. The device according to claim 1, 2 or 3, characterized in that:
the second acquisition unit is specifically configured to perform wavelet decomposition on the monochrome image and the color image respectively, to obtain first wavelet decomposition coefficients of the luminance information of the monochrome image and second wavelet decomposition coefficients of the luminance information of the color image;
the fusion unit is specifically configured to perform, in combination with the fusion weights and a preset function, a function calculation on the first wavelet decomposition coefficients and the second wavelet decomposition coefficients to obtain third wavelet decomposition coefficients, and to perform inverse image processing using the third wavelet decomposition coefficients to generate the fused image.
5. The device according to claim 4, characterized in that:
the fusion unit is specifically configured to: in a first frequency band, apply a first functional relation to the first wavelet decomposition coefficients and the second wavelet decomposition coefficients in combination with the fusion weights, to obtain third wavelet decomposition coefficients for the first frequency band; and in a second frequency band, apply a second functional relation to the first wavelet decomposition coefficients and the second wavelet decomposition coefficients in combination with the fusion weights, to obtain third wavelet decomposition coefficients for the second frequency band;
the first functional relation is different from the second functional relation; the first frequency band and the second frequency band are frequency bands to which the human eye has different sensitivities.
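Claims 4 and 5 together describe fusing the two luminance planes in the wavelet domain with different rules per frequency band. A minimal single-level Haar sketch follows; the wavelet basis, the number of levels, and the two band rules (weighted average for the low band, maximum magnitude for the high bands) are assumptions, since the claims only require that the two functional relations differ.

```python
import numpy as np

def haar2(img):
    """Single-level 2-D Haar decomposition into (LL, LH, HL, HH) bands."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    return ((a + b + c + d) / 4, (a + b - c - d) / 4,
            (a - b + c - d) / 4, (a - b - c + d) / 4)

def ihaar2(ll, lh, hl, hh):
    """Exact inverse of haar2 (the 'image inverse processing' step)."""
    h, w = ll.shape
    out = np.empty((2 * h, 2 * w))
    out[0::2, 0::2] = ll + lh + hl + hh
    out[0::2, 1::2] = ll + lh - hl - hh
    out[1::2, 0::2] = ll - lh + hl - hh
    out[1::2, 1::2] = ll - lh - hl + hh
    return out

def fuse_luma(y_mono, y_color, w_mono, w_color):
    """Fuse two luminance planes in the Haar wavelet domain.

    Low band (first frequency band): weighted average with the fusion
    weights, subsampled to the coefficient grid.
    High bands (second frequency band): keep the coefficient with the
    larger magnitude, a common choice for detail bands.
    """
    m, c = haar2(y_mono), haar2(y_color)
    wm, wc = w_mono[0::2, 0::2], w_color[0::2, 0::2]
    ll = (wm * m[0] + wc * c[0]) / (wm + wc + 1e-12)
    highs = [np.where(np.abs(mb) >= np.abs(cb), mb, cb)
             for mb, cb in zip(m[1:], c[1:])]
    return ihaar2(ll, *highs)
```

Treating the approximation band and the detail bands differently matches the claim's rationale: the human eye is more sensitive to low-frequency luminance errors, so those coefficients are blended smoothly, while detail coefficients are selected to preserve edges.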
6. An image generation method, characterized by comprising:
acquiring a monochrome image and a color image of a same acquisition target;
performing image processing on the monochrome image and the color image respectively, to obtain monochrome image information of the monochrome image and color image information of the color image;
determining, in combination with the monochrome image information and the color image information, fusion weights for the monochrome image and the color image;
fusing the monochrome image information and the color image according to the fusion weights, to generate a fused image.
7. The method according to claim 6, characterized in that determining, in combination with the monochrome image information and the color image information, the fusion weights for the monochrome image and the color image comprises:
determining, based on the monochrome image information, a region contrast and a region gradient value of the region around each pixel in the monochrome image, and, based on the color image information, a region contrast and a region gradient value of the region around each pixel in the color image;
multiplying the region contrast by the region gradient value, to obtain fusion parameters for each pixel in the monochrome image and in the color image respectively;
calculating, based on the fusion parameters, fusion weights for each pixel of the monochrome image and of the color image respectively.
8. The method according to claim 7, characterized in that performing image processing on the monochrome image and the color image respectively, to obtain the monochrome image information of the monochrome image and the color image information of the color image, comprises:
extracting first luminance information of the monochrome image;
extracting second luminance information and color information of the color image;
and fusing the monochrome image information and the color image according to the fusion weights, to generate the fused image, comprises:
processing the first luminance information and the second luminance information with a preset function using the fusion weights, to obtain third luminance information of the fused image;
generating the fused image by combining the third luminance information and the color information.
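The extract-fuse-recombine flow of claim 8 amounts to working in a luma/chroma space: take the luminance from both images, fuse the luminance planes, and reattach the color image's chroma. A sketch using BT.601 YUV conversion; the color space and the weighted-average "preset function" are assumptions, since the claims only speak of "luminance information" and "color information".

```python
import numpy as np

# BT.601 RGB -> YUV conversion matrix
_M = np.array([[ 0.299,    0.587,    0.114  ],
               [-0.14713, -0.28886,  0.436  ],
               [ 0.615,   -0.51499, -0.10001]])

def rgb_to_yuv(rgb):
    """rgb: float array of shape (H, W, 3) in [0, 1]."""
    return rgb @ _M.T

def yuv_to_rgb(yuv):
    """Inverse BT.601 conversion."""
    return yuv @ np.linalg.inv(_M).T

def fuse(mono_y, color_rgb, w_mono, w_color):
    """Blend the monochrome luminance into the color image's Y channel,
    keeping the color image's U/V planes (the 'color information')."""
    yuv = rgb_to_yuv(color_rgb)
    # third luminance information: weighted blend of the two Y planes
    y3 = (w_mono * mono_y + w_color * yuv[..., 0]) / (w_mono + w_color + 1e-12)
    out = yuv.copy()
    out[..., 0] = y3
    return np.clip(yuv_to_rgb(out), 0.0, 1.0)
```

Because only the Y channel is replaced, the fused image inherits the color image's chrominance while gaining the detail and low-light sensitivity captured by the monochrome sensor, which is the stated benefit of the dual-camera arrangement.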
9. The method according to claim 6, 7 or 8, characterized in that performing image processing on the monochrome image and the color image respectively, to obtain the monochrome image information of the monochrome image and the color image information of the color image, comprises:
performing wavelet decomposition on the monochrome image and the color image respectively, to obtain first wavelet decomposition coefficients of the luminance information of the monochrome image and second wavelet decomposition coefficients of the luminance information of the color image;
and fusing the monochrome image information and the color image according to the fusion weights, to generate the fused image, comprises:
performing, in combination with the fusion weights and a preset function, a function calculation on the first wavelet decomposition coefficients and the second wavelet decomposition coefficients, to obtain third wavelet decomposition coefficients;
performing inverse image processing using the third wavelet decomposition coefficients, to generate the fused image.
10. The method according to claim 9, characterized in that performing, in combination with the fusion weights and the preset function, the function calculation on the first wavelet decomposition coefficients and the second wavelet decomposition coefficients, to obtain the third wavelet decomposition coefficients, comprises:
in a first frequency band, applying a first functional relation to the first wavelet decomposition coefficients and the second wavelet decomposition coefficients in combination with the fusion weights, to obtain third wavelet decomposition coefficients for the first frequency band;
in a second frequency band, applying a second functional relation to the first wavelet decomposition coefficients and the second wavelet decomposition coefficients in combination with the fusion weights, to obtain third wavelet decomposition coefficients for the second frequency band;
the first functional relation is different from the second functional relation; the first frequency band and the second frequency band are frequency bands to which the human eye has different sensitivities.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610750445.0A CN106447641A (en) | 2016-08-29 | 2016-08-29 | Image generation device and method |
PCT/CN2017/092539 WO2018040751A1 (en) | 2016-08-29 | 2017-07-11 | Image generation apparatus and method therefor, and image processing device and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610750445.0A CN106447641A (en) | 2016-08-29 | 2016-08-29 | Image generation device and method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106447641A true CN106447641A (en) | 2017-02-22 |
Family
ID=58182082
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610750445.0A Pending CN106447641A (en) | 2016-08-29 | 2016-08-29 | Image generation device and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106447641A (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106887027A (en) * | 2017-03-13 | 2017-06-23 | 沈阳东软医疗系统有限公司 | A kind of methods, devices and systems of ultrasonic sampled-data processing |
CN106920327A (en) * | 2017-03-02 | 2017-07-04 | 上海巽晔计算机科技有限公司 | A kind of high efficiency retracting device based on image recognition |
WO2018040751A1 (en) * | 2016-08-29 | 2018-03-08 | 努比亚技术有限公司 | Image generation apparatus and method therefor, and image processing device and storage medium |
CN108111778A (en) * | 2017-12-25 | 2018-06-01 | 信利光电股份有限公司 | A kind of photographic device and electronic equipment |
CN108389165A (en) * | 2018-02-02 | 2018-08-10 | 成都西纬科技有限公司 | A kind of image de-noising method |
CN108665498A (en) * | 2018-05-15 | 2018-10-16 | 北京市商汤科技开发有限公司 | Image processing method, device, electronic equipment and storage medium |
CN110012215A (en) * | 2017-12-08 | 2019-07-12 | 索尼半导体解决方案公司 | Image processing apparatus and image processing method |
CN110298812A (en) * | 2019-06-25 | 2019-10-01 | 浙江大华技术股份有限公司 | A kind of method and device of image co-registration processing |
CN110310223A (en) * | 2019-07-03 | 2019-10-08 | 云南电网有限责任公司电力科学研究院 | A kind of fusion method of ultraviolet light and visible images |
CN110326288A (en) * | 2017-03-02 | 2019-10-11 | 索尼公司 | Image processing equipment and imaging device |
CN110456348A (en) * | 2019-08-19 | 2019-11-15 | 中国石油大学(华东) | The wave cut-off wavelength compensation method of more visual direction SAR ocean wave spectrum data fusions |
CN110876016A (en) * | 2018-08-31 | 2020-03-10 | 珠海格力电器股份有限公司 | Image processing method, apparatus and storage medium |
EP3945713A1 (en) * | 2020-07-29 | 2022-02-02 | Beijing Xiaomi Mobile Software Co., Ltd. | Image processing method and apparatus, and storage medium |
CN116528040A (en) * | 2023-07-03 | 2023-08-01 | 清华大学 | Image super-resolution reconstruction method and device based on compound eye intelligent camera and camera |
CN117676120A (en) * | 2023-12-14 | 2024-03-08 | 深圳市眼科医院(深圳市眼病防治研究所) | Intelligent vision-aiding glasses for enlarging visual field range of patient with visual field defect |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102231206A (en) * | 2011-07-14 | 2011-11-02 | 浙江理工大学 | Colorized night vision image brightness enhancement method applicable to automotive assisted driving system |
US20130028511A1 (en) * | 2010-09-16 | 2013-01-31 | Thomson Licensing | Method and device of determining a saliency map for an image |
CN103761724A (en) * | 2014-01-28 | 2014-04-30 | 中国石油大学(华东) | Visible light and infrared video fusion method based on surreal luminance contrast pass algorithm |
CN105163047A (en) * | 2015-09-15 | 2015-12-16 | 厦门美图之家科技有限公司 | HDR (High Dynamic Range) image generation method and system based on color space conversion and shooting terminal |
CN105744159A (en) * | 2016-02-15 | 2016-07-06 | 努比亚技术有限公司 | Image synthesizing method and device |
CN105827965A (en) * | 2016-03-25 | 2016-08-03 | 维沃移动通信有限公司 | Image processing method based on mobile terminal and mobile terminal |
CN105827970A (en) * | 2016-03-31 | 2016-08-03 | 维沃移动通信有限公司 | Image processing method and mobile terminal |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130028511A1 (en) * | 2010-09-16 | 2013-01-31 | Thomson Licensing | Method and device of determining a saliency map for an image |
CN102231206A (en) * | 2011-07-14 | 2011-11-02 | 浙江理工大学 | Colorized night vision image brightness enhancement method applicable to automotive assisted driving system |
CN103761724A (en) * | 2014-01-28 | 2014-04-30 | 中国石油大学(华东) | Visible light and infrared video fusion method based on surreal luminance contrast pass algorithm |
CN105163047A (en) * | 2015-09-15 | 2015-12-16 | 厦门美图之家科技有限公司 | HDR (High Dynamic Range) image generation method and system based on color space conversion and shooting terminal |
CN105744159A (en) * | 2016-02-15 | 2016-07-06 | 努比亚技术有限公司 | Image synthesizing method and device |
CN105827965A (en) * | 2016-03-25 | 2016-08-03 | 维沃移动通信有限公司 | Image processing method based on mobile terminal and mobile terminal |
CN105827970A (en) * | 2016-03-31 | 2016-08-03 | 维沃移动通信有限公司 | Image processing method and mobile terminal |
Non-Patent Citations (1)
Title |
---|
WANG Jian et al.: "Visible and Infrared Image Fusion Based on YUV and Wavelet Transform", Journal of Xi'an Technological University * |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018040751A1 (en) * | 2016-08-29 | 2018-03-08 | 努比亚技术有限公司 | Image generation apparatus and method therefor, and image processing device and storage medium |
CN106920327A (en) * | 2017-03-02 | 2017-07-04 | 上海巽晔计算机科技有限公司 | A kind of high efficiency retracting device based on image recognition |
CN106920327B (en) * | 2017-03-02 | 2019-04-05 | 浙江古伽智能科技有限公司 | A kind of high efficiency recyclable device based on image recognition |
CN110326288B (en) * | 2017-03-02 | 2021-07-06 | 索尼公司 | Image processing apparatus and imaging apparatus |
CN110326288A (en) * | 2017-03-02 | 2019-10-11 | 索尼公司 | Image processing equipment and imaging device |
CN106887027A (en) * | 2017-03-13 | 2017-06-23 | 沈阳东软医疗系统有限公司 | A kind of methods, devices and systems of ultrasonic sampled-data processing |
CN110012215A (en) * | 2017-12-08 | 2019-07-12 | 索尼半导体解决方案公司 | Image processing apparatus and image processing method |
CN108111778A (en) * | 2017-12-25 | 2018-06-01 | 信利光电股份有限公司 | A kind of photographic device and electronic equipment |
CN108389165A (en) * | 2018-02-02 | 2018-08-10 | 成都西纬科技有限公司 | A kind of image de-noising method |
CN108665498A (en) * | 2018-05-15 | 2018-10-16 | 北京市商汤科技开发有限公司 | Image processing method, device, electronic equipment and storage medium |
CN108665498B (en) * | 2018-05-15 | 2023-05-12 | 北京市商汤科技开发有限公司 | Image processing method, device, electronic equipment and storage medium |
CN110876016A (en) * | 2018-08-31 | 2020-03-10 | 珠海格力电器股份有限公司 | Image processing method, apparatus and storage medium |
CN110876016B (en) * | 2018-08-31 | 2021-03-16 | 珠海格力电器股份有限公司 | Image processing method, apparatus and storage medium |
CN110298812A (en) * | 2019-06-25 | 2019-10-01 | 浙江大华技术股份有限公司 | A kind of method and device of image co-registration processing |
CN110310223A (en) * | 2019-07-03 | 2019-10-08 | 云南电网有限责任公司电力科学研究院 | A kind of fusion method of ultraviolet light and visible images |
CN110456348A (en) * | 2019-08-19 | 2019-11-15 | 中国石油大学(华东) | The wave cut-off wavelength compensation method of more visual direction SAR ocean wave spectrum data fusions |
WO2021031466A1 (en) * | 2019-08-19 | 2021-02-25 | 中国石油大学(华东) | Wave cutoff wavelength compensation method for multiview sar wave spectrum data fusion |
CN114066784A (en) * | 2020-07-29 | 2022-02-18 | 北京小米移动软件有限公司 | Image processing method, device and storage medium |
KR20220014801A (en) * | 2020-07-29 | 2022-02-07 | 베이징 시아오미 모바일 소프트웨어 컴퍼니 리미티드 | Image processing method, device and storage medium |
EP3945713A1 (en) * | 2020-07-29 | 2022-02-02 | Beijing Xiaomi Mobile Software Co., Ltd. | Image processing method and apparatus, and storage medium |
KR102563468B1 (en) * | 2020-07-29 | 2023-08-04 | 베이징 시아오미 모바일 소프트웨어 컴퍼니 리미티드 | Image processing method, device and storage medium |
US11900637B2 (en) | 2020-07-29 | 2024-02-13 | Beijing Xiaomi Mobile Software Co., Ltd. | Image processing method and apparatus, and storage medium |
CN116528040A (en) * | 2023-07-03 | 2023-08-01 | 清华大学 | Image super-resolution reconstruction method and device based on compound eye intelligent camera and camera |
CN116528040B (en) * | 2023-07-03 | 2023-09-05 | 清华大学 | Image super-resolution reconstruction method and device based on compound eye intelligent camera and camera |
CN117676120A (en) * | 2023-12-14 | 2024-03-08 | 深圳市眼科医院(深圳市眼病防治研究所) | Intelligent vision-aiding glasses for enlarging visual field range of patient with visual field defect |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106447641A (en) | Image generation device and method | |
CN105430295B (en) | Image processing apparatus and method | |
CN105825485B (en) | A kind of image processing system and method | |
CN106485689B (en) | A kind of image processing method and device | |
CN106791472A (en) | A kind of exposure method and terminal | |
CN106780634A (en) | Picture dominant tone extracting method and device | |
CN105744159A (en) | Image synthesizing method and device | |
CN106210195A (en) | The changing method of a kind of double-sided screen and terminal | |
CN105455781A (en) | Information processing method and electronic device | |
CN106454105A (en) | Device and method for image processing | |
CN105956999A (en) | Thumbnail generating device and method | |
CN106713640B (en) | A kind of brightness adjusting method and equipment | |
CN106873936A (en) | Electronic equipment and information processing method | |
WO2017088680A1 (en) | Image processing apparatus and method | |
CN107438179A (en) | A kind of information processing method and terminal | |
CN106569709A (en) | Device and method for controlling mobile terminal | |
CN106791022A (en) | A kind of mobile terminal and screenshot method | |
CN106658159A (en) | Control method and first electronic equipment, and target equipment | |
CN105095903A (en) | Electronic equipment and image processing method | |
CN106303229A (en) | A kind of photographic method and device | |
CN107071263A (en) | A kind of image processing method and terminal | |
CN105405108B (en) | Image sharpening method and mobile terminal | |
CN107192937A (en) | Fingerprint chip detecting method and device, electronic equipment and storage medium | |
CN106855997A (en) | The processing method and processing device of picture | |
CN106355569A (en) | Image generating device and method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20170222 |