CN106960413A - A method and terminal for image blurring - Google Patents
A method and terminal for image blurring
- Publication number
- CN106960413A (application CN201710183618.XA)
- Authority
- CN
- China
- Prior art keywords
- depth
- pixel
- present image
- image
- pixel cell
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/04—Context-preserving transformations, e.g. by using an importance map
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
The embodiment of the invention discloses a method and terminal for image blurring. The method includes: obtaining a current image captured by a camera; obtaining the pixel color value of each pixel unit in the current image; obtaining the depth of field Depth of each pixel unit in the current image; determining a blur parameter for each pixel unit in the current image according to the depth of field Depth; and blurring each pixel unit in the current image according to the determined blur parameter and the pixel color value. The embodiment of the invention can blur images shot by a dual camera effectively, quickly and accurately to achieve a bokeh effect; the computational complexity of the blurring is low, and the user experience is improved.
Description
Technical field
The present invention relates to the field of electronic technology, and in particular to a method and terminal for image blurring.
Background art
With the development of mobile device technology and image capture technology, blurring of images captured by dual cameras has received significant attention. So-called dual-camera blurring means taking a picture with two cameras and then applying blur processing: one camera is responsible for imaging, while the other is used to compute the depth of field, that is, the distance from the lens of each pixel unit or region in the captured image. Subsequent software processing based on these distances then produces the blur effect.
However, the blurring schemes in the prior art cannot blur the different regions of an image efficiently and accurately, and the computational complexity of their blur processing is high, so the blur of the image falls short of the degree required by users and degrades the user experience.
Summary of the invention
The embodiment of the present invention provides a method and terminal for image blurring, which can blur an image according to its depth of field Depth to achieve a bokeh effect.
In one aspect, the embodiment of the invention provides an image blurring method, the method including:
obtaining a current image captured by a camera;
obtaining the pixel color value of each pixel unit in the current image;
obtaining the depth of field Depth of each pixel unit in the current image;
determining a blur parameter for each pixel unit in the current image according to the depth of field Depth;
blurring each pixel unit in the current image according to the determined blur parameter and the pixel color value.
In another aspect, the embodiment of the invention provides a terminal, the terminal including:
a first acquiring unit, configured to obtain a current image captured by a camera;
a second acquiring unit, configured to obtain the pixel color value of each pixel unit in the current image;
a third acquiring unit, configured to obtain the depth of field Depth of each pixel unit in the current image;
a first determining unit, configured to determine a blur parameter for each pixel unit in the current image according to the depth of field Depth;
a blur processing unit, configured to blur each pixel unit in the current image according to the determined blur parameter and the pixel color value.
In summary, the invention has the following advantages: by obtaining the current image captured by the camera, obtaining the pixel color value of each pixel unit in the current image, obtaining the depth of field Depth of each pixel unit in the current image, determining the blur parameter of each pixel unit according to the depth of field Depth, and blurring each pixel unit according to the determined blur parameter and the pixel color value, images shot by a dual camera can be blurred effectively, quickly and accurately to achieve a bokeh effect. The computational complexity of the blurring is low, and the user experience can be improved.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the accompanying drawings needed to describe the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flow diagram of an image blurring method provided by an embodiment of the invention.
Fig. 2 is another schematic flow diagram of an image blurring method provided by an embodiment of the invention.
Fig. 3 is a schematic block diagram of a terminal provided by an embodiment of the invention.
Fig. 4 is another schematic block diagram of a terminal provided by an embodiment of the invention.
Fig. 5 is a schematic structural diagram of an image blurring device provided by an embodiment of the invention.
Detailed description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art based on these embodiments without creative effort fall within the protection scope of the invention.
It should be understood that when used in this specification and the appended claims, the terms "comprising" and "including" indicate the presence of the described features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or collections thereof.
It should also be understood that the terminology used in this description of the invention is for the purpose of describing particular embodiments only and is not intended to limit the invention. As used in the description of the invention and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Referring to Fig. 1, Fig. 1 is a schematic flow diagram of an image blurring method provided by Embodiment 1 of the present invention. The method can run in terminals such as smartphones (e.g. Android or iOS phones), tablet computers with a shooting function, and photographic devices (e.g. dual cameras, video cameras). As shown in Fig. 1, the method includes steps S101 to S105.
S101: obtain the current image captured by the camera.
In the embodiment of the invention, an image of the current scene is shot by the dual camera of the terminal; it may be an image of the surrounding environment or a portrait in the current scene. The acquired current image is a YUV image, which the terminal stores using three channels: Y represents luminance, that is, the gray value, while U and V represent chrominance, describing the color saturation of the image and specifying the color of each image pixel. The YUV image must be parsed from a YUV stream, and YUV streams come in a variety of formats; in the embodiment of the invention, the YUV image may use formats such as YUV4:4:4, YUV4:2:2 or YUV4:2:0.
S102: obtain the pixel color value of each pixel unit in the current image.
In the embodiment of the invention, the terminal obtains the pixel color value of each pixel unit in the current image. The current image is a YUV image, and the pixel color value refers to its YUV values. YUV images use a variety of stream storage layouts, and the YUV values can be read from the corresponding layout; these layouts include the YUYV format, the UYVY format, YUV422P, YV12, YU12, and so on. For example, for a stream using the YUYV layout, which belongs to the YUV4:2:2 format, the YUV values are stored as shown in Table 1 below.
Table 1
Start+0 | Y’00 | Cb00 | Y’01 | Cr00 | Y’02 | Cb01 | Y’03 | Cr01 |
Start+8 | Y’10 | Cb10 | Y’11 | Cr10 | Y’12 | Cb11 | Y’13 | Cr11 |
Start+16 | Y’20 | Cb20 | Y’21 | Cr20 | Y’22 | Cb21 | Y’23 | Cr21 |
Start+24 | Y’30 | Cb30 | Y’31 | Cr30 | Y’32 | Cb31 | Y’33 | Cr31 |
Here Cb and Cr stand for the U and V of YUV, and Y' stands for the Y of YUV. In Table 1 above, every two adjacent Y samples share the two adjacent Cb, Cr samples: for pixels Y'00 and Y'01, the Cb, Cr values are Cb00 and Cr00, and so on for the other pixels.
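The sharing rule of the YUYV layout in Table 1 can be sketched in a few lines of Python. This is only an illustration of the layout; the buffer, function name and example values are not from the patent:

```python
def yuyv_pixel(buf, width, x, y):
    """Return (Y, Cb, Cr) for pixel (x, y) in a YUYV (YUV 4:2:2) packed buffer.

    Layout per pair of pixels: Y0 Cb Y1 Cr, so two horizontally adjacent
    pixels share the same Cb/Cr sample, as in Table 1.
    """
    row = y * width * 2          # 2 bytes per pixel on average
    base = row + (x // 2) * 4    # each 4-byte group covers 2 pixels
    y_val = buf[base + (2 if x % 2 else 0)]
    cb, cr = buf[base + 1], buf[base + 3]
    return y_val, cb, cr

# 2x2 image packed as YUYV: row 0 = Y'00 Cb00 Y'01 Cr00, row 1 = Y'10 Cb10 Y'11 Cr10
buf = bytes([81, 90, 82, 240, 83, 110, 84, 34])
print(yuyv_pixel(buf, 2, 0, 0))  # (81, 90, 240): Y'00 with Cb00, Cr00
print(yuyv_pixel(buf, 2, 1, 0))  # (82, 90, 240): Y'01 shares the same Cb/Cr
```

As the second call shows, pixels Y'00 and Y'01 read identical chrominance samples, which is exactly the sharing described above.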
As another example, for a stream using the YV12 or YU12 layout, which belongs to the YUV4:2:0 format, the YUV values are stored as shown in Table 2 below.
Table 2
Here Cb and Cr stand for the U and V of YUV, and Y' stands for the Y of YUV. In Table 2 above, the Y, U and V components are packed separately and stored plane by plane; when extracting the YUV values of each pixel, every four Y components share one group of UV values. For example, Y'00, Y'01, Y'10 and Y'11 share Cr00 and Cb00, and so on for the others.
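The planar rule just described (four Y components per UV group) can also be sketched. The plane order shown is YU12, i.e. the Y plane followed by U then V; YV12 swaps the U and V planes. The function name and example buffer are illustrative assumptions:

```python
def yu12_pixel(buf, width, height, x, y):
    """Return (Y, U, V) for pixel (x, y) in a YU12 (planar YUV 4:2:0) buffer.

    Plane sizes: Y = width*height, U = V = width*height // 4.
    The four Y samples of each 2x2 block share one U and one V sample.
    """
    y_size = width * height
    c_size = y_size // 4
    cw = width // 2                      # chroma plane width
    y_val = buf[y * width + x]
    c_index = (y // 2) * cw + (x // 2)   # shared chroma sample of the 2x2 block
    u = buf[y_size + c_index]
    v = buf[y_size + c_size + c_index]
    return y_val, u, v

# 2x2 image: Y plane = [Y'00, Y'01, Y'10, Y'11], then one U sample, one V sample
buf = bytes([81, 82, 83, 84, 90, 240])
print(yu12_pixel(buf, 2, 2, 0, 0))  # (81, 90, 240)
print(yu12_pixel(buf, 2, 2, 1, 1))  # (84, 90, 240): same U, V as Y'00
```

All four Y samples of the 2x2 block resolve to the same (U, V) pair, matching the extraction rule stated for Table 2.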
Further, a pixel unit is specifically a single pixel or a pixel block composed of several pixels.
S103: obtain the depth of field Depth of each pixel unit in the current image.
In the embodiment of the invention, the depth of field Depth of each pixel unit in the current image is obtained by the terminal's infrared sensor in cooperation with the dual camera; specifically, the depth of field Depth described in the embodiment of the invention is obtained from the two camera modules of the dual camera. In practice, the depth of field Depth may also be obtained in other ways, and its acquisition is not limited to this. The depth of field Depth of each pixel unit in the current image corresponds to the distance from the camera of the object in the field of view shot by the terminal's dual camera: the farther the object is from the camera, the larger the depth of field Depth; the nearer the object, the smaller the depth of field Depth. It should be noted that the depth of field Depth is also related to the aperture and focal length of the camera: the larger the aperture, the smaller the depth of field Depth, and the smaller the aperture, the larger; the longer the focal length, the smaller the depth of field Depth, and the shorter the focal length, the larger.
Further, as shown in Fig. 2, step S103 specifically includes steps S201 and S202.
S201: determine the phase difference produced by the distance and angle between the modules of the dual camera.
In the embodiment of the invention, because there is a certain phase difference between the main camera module and the auxiliary camera module of the terminal's dual camera, the depth of field Depth of each pixel unit can be computed from that phase difference.
S202: compute the depth of field Depth of each pixel unit in the current image from the phase difference.
In the embodiment of the invention, the depth of field Depth of each pixel unit in the current image is computed from the phase difference between the main and auxiliary camera modules of the dual camera.
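The patent does not spell out the computation in S202, but the standard way to turn the shift between two camera modules into a distance is stereo triangulation: Depth = f * B / d, with f the focal length in pixels, B the baseline between the modules, and d the per-pixel disparity (the phase difference). A hedged sketch under that assumption:

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Triangulate depth from the disparity between the two camera modules.

    Standard stereo relation (an assumption here, not the patent's formula):
    Depth = f * B / d, where f is the focal length in pixels, B the module
    baseline, and d the disparity of the pixel unit between the two images.
    """
    if disparity_px <= 0:
        return float("inf")   # no measurable shift: treat the point as far away
    return focal_px * baseline_mm / disparity_px

# Example: 1000 px focal length, 10 mm baseline.
# A 20 px disparity places the pixel unit 500 mm from the lens;
# a nearer object produces a larger disparity and a smaller Depth.
print(depth_from_disparity(1000, 10, 20))  # 500.0
print(depth_from_disparity(1000, 10, 40))  # 250.0
```

This matches the monotonic behaviour stated above: the nearer the object, the smaller the depth of field Depth.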
S104: determine the blur parameter of each pixel unit in the current image according to the depth of field Depth.
In the embodiment of the invention, the method of determining the blur parameter of each pixel unit in the current image according to the depth of field Depth specifically includes: obtaining the preset maximum value of the camera's blur level; and determining the blur parameter of the corresponding pixel unit from that preset maximum value and the depth of field Depth. Specifically, the blur parameter λ of each pixel unit in the current image can be computed according to formula one below:
λ = D_F / T_F * λ_MAX
where D_F is the depth of field Depth of each pixel unit, T_F is the distance parameter of each pixel unit, and λ_MAX is the maximum blur level set for the current image.
Specifically, in practice, to make the subject of a shot stand out, an SLR camera commonly blurs the background. The essence of blurring the background is to render each point of the background as a circle of confusion on the photosensitive surface: the larger the diameter of the circle of confusion, the "softer" the background appears. An SLR, however, adjusts this blur through aperture and focal length. For a smartphone dual camera, blurring means shooting with two cameras and then processing: one camera is responsible for imaging, while the other is used to compute a depth map, that is, the distance from the lens of each pixel or region in the image at the time of shooting. Subsequent software processing based on these distances then produces the blur effect.
It should be noted that in formula one above, D_F is the depth of field Depth of each pixel unit, obtained by the dual camera from the phase difference between the main and auxiliary camera modules; λ_MAX is the maximum blur level set for the current image, which differs between camera devices and can be read from the device's basic parameter settings; and T_F is the distance parameter of each pixel unit, referring to the distance from the dual camera of each pixel unit of the captured current image in the scene.
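Formula one is a one-line computation; the sketch below shows it for two pixel units, with the parameter names mirroring D_F, T_F and λ_MAX (the example values are illustrative assumptions):

```python
def blur_parameter(depth, distance_param, lambda_max):
    """Formula one: lambda = D_F / T_F * lambda_MAX.

    depth:          D_F, the depth of field Depth of the pixel unit
    distance_param: T_F, the distance parameter of the pixel unit
    lambda_max:     maximum blur level set for the current image
    """
    return depth / distance_param * lambda_max

# With T_F = 2.0 and lambda_MAX = 8, a deeper (farther) pixel unit
# receives a larger blur parameter than a shallower (nearer) one.
print(blur_parameter(1.5, 2.0, 8))  # 6.0
print(blur_parameter(0.5, 2.0, 8))  # 2.0
```

This reproduces the behaviour the surrounding text describes: blur scales linearly with the pixel unit's depth of field Depth, capped by the device-specific λ_MAX when D_F equals T_F.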
S105: blur each pixel unit in the current image according to the determined blur parameter and the pixel color value.
In the embodiment of the invention, the method of blurring each pixel unit in the current image according to the determined blur parameter and the pixel color value specifically includes: determining the width of the current image in advance; and blurring the corresponding pixel unit according to the determined width of the current image, the determined blur parameter, and the pixel color value. Specifically, the embodiment of the invention performs the blur processing according to formulas two and three below.
Formula two is:
Formula three is:
where x[k] is the pixel color value of each pixel unit in the current image, M is the width of the current image, and λ is the determined blur parameter.
It should be noted that existing image blurring algorithms require a forward-iteration pass and a backward-iteration pass, followed by averaging according to certain rules, before the blur of the image is complete. In the blurring algorithm of the embodiment of the invention, as shown by the formulas above, the image is blurred directly from the determined image width, the blur parameter of each pixel unit, and the pixel color value, thereby avoiding the complex forward and backward iteration and the extra averaging step. This improves the efficiency of the blur processing algorithm, keeps its computational complexity low, and allows a parameter-controlled blur to be applied to each pixel unit individually. Specifically, the element x[k] computed by formula two refers to the pixel color value of each pixel unit in the current image; that is, x[k] is the YUV value of each pixel unit in the current image acquired in the above steps of this embodiment.
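The bodies of formulas two and three are not given in the text above, so the following is only an illustrative single-pass sketch of the stated idea: each pixel color value x[k] of a row of width M is blended with a running local average under control of its own blur parameter λ, with no forward/backward iteration and no separate averaging pass. The mixing rule is an assumption for illustration, not the patent's formula:

```python
def blur_row(x, lambdas):
    """Illustrative single-pass blur of one image row.

    x:       pixel color values x[k] of the row (length M, the image width)
    lambdas: blur parameter determined for each pixel unit, scaled to [0, 1]
    Each output sample mixes the pixel with a running average of the row,
    weighted by its own lambda, in one left-to-right pass.
    """
    out = []
    acc = x[0]                            # running mean of samples seen so far
    for k, (xk, lam) in enumerate(zip(x, lambdas)):
        acc = acc + (xk - acc) / (k + 1)  # incremental mean, single pass
        out.append((1 - lam) * xk + lam * acc)
    return out

row = [10, 200, 10, 200]
print(blur_row(row, [0.0] * 4))  # lambda = 0 everywhere: row unchanged
print(blur_row(row, [0.5] * 4))  # smoothed toward the local mean
```

The key property matches the passage above: a pixel unit with λ = 0 (in-focus subject) passes through untouched, while larger λ (background) is pulled toward its surroundings, and the whole row is processed in a single pass.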
As can be seen from the above, the embodiment of the invention obtains the current image captured by the dual camera, obtains the pixel color value of each pixel unit in the current image, obtains the depth of field Depth of each pixel unit, determines the blur parameter of each pixel unit according to the depth of field Depth, and blurs each pixel unit according to the determined blur parameter and the pixel color value, so that images shot by the dual camera can be blurred effectively, quickly and accurately to achieve a bokeh effect. The computational complexity of the blurring is low, and the user experience can be improved.
Referring to Fig. 3, corresponding to the image blurring method above, the embodiment of the present invention also provides a terminal. The terminal 100 includes: a first acquiring unit 101, a second acquiring unit 102, a third acquiring unit 103, a first determining unit 104, and a blur processing unit 105.
The first acquiring unit 101 is configured to obtain the current image captured by the camera. In the embodiment of the invention, an image of the current scene is shot by the dual camera of the terminal; it may be an image of the surrounding environment or a portrait in the current scene. The acquired current image is a YUV image, which the terminal stores using three channels: Y represents luminance, that is, the gray value, while U and V represent chrominance, describing the color saturation of the image and specifying the color of each image pixel. The YUV image must be parsed from a YUV stream, and YUV streams come in a variety of formats; in the embodiment of the invention, the YUV image may use formats such as YUV4:4:4, YUV4:2:2 or YUV4:2:0.
The second acquiring unit 102 is configured to obtain the pixel color value of each pixel unit in the current image. In the embodiment of the invention, the terminal obtains the pixel color value of each pixel unit in the current image. The current image is a YUV image, and the pixel color value refers to its YUV values. YUV images use a variety of stream storage layouts, and the YUV values can be read from the corresponding layout; these layouts include the YUYV format, the UYVY format, YUV422P, YV12, YU12, and so on. For example, for a stream using the YUYV layout, which belongs to the YUV4:2:2 format, the YUV values are stored as shown in Table 3 below.
Table 3
Start+0 | Y’00 | Cb00 | Y’01 | Cr00 | Y’02 | Cb01 | Y’03 | Cr01 |
Start+8 | Y’10 | Cb10 | Y’11 | Cr10 | Y’12 | Cb11 | Y’13 | Cr11 |
Start+16 | Y’20 | Cb20 | Y’21 | Cr20 | Y’22 | Cb21 | Y’23 | Cr21 |
Start+24 | Y’30 | Cb30 | Y’31 | Cr30 | Y’32 | Cb31 | Y’33 | Cr31 |
Here Cb and Cr stand for the U and V of YUV, and Y' stands for the Y of YUV. In Table 3 above, every two adjacent Y samples share the two adjacent Cb, Cr samples: for pixels Y'00 and Y'01, the Cb, Cr values are Cb00 and Cr00, and so on for the other pixels.
As another example, for a stream using the YV12 or YU12 layout, which belongs to the YUV4:2:0 format, the YUV values are stored as shown in Table 4 below.
Table 4
Here Cb and Cr stand for the U and V of YUV, and Y' stands for the Y of YUV. In Table 4 above, the Y, U and V components are packed separately and stored plane by plane; when extracting the YUV values of each pixel, every four Y components share one group of UV values. For example, Y'00, Y'01, Y'10 and Y'11 share Cr00 and Cb00, and so on for the others.
Further, a pixel unit is specifically a single pixel or a pixel block composed of several pixels.
The third acquiring unit 103 is configured to obtain the depth of field Depth of each pixel unit in the current image. In the embodiment of the invention, the depth of field Depth of each pixel unit in the current image is obtained by the terminal's infrared sensor in cooperation with the dual camera; specifically, the depth of field Depth described in the embodiment of the invention is obtained from the two camera modules of the dual camera. In practice, the depth of field Depth may also be obtained in other ways, and its acquisition is not limited to this. The depth of field Depth of each pixel unit in the current image corresponds to the distance from the camera of the object in the field of view shot by the terminal's dual camera: the farther the object is from the camera, the larger the depth of field Depth; the nearer the object, the smaller the depth of field Depth. It should be noted that the depth of field Depth is also related to the aperture and focal length of the camera: the larger the aperture, the smaller the depth of field Depth, and the smaller the aperture, the larger; the longer the focal length, the smaller the depth of field Depth, and the shorter the focal length, the larger.
As shown in Fig. 4, the third acquiring unit 103 specifically includes:
a second determining unit 1031, configured to determine the phase difference produced by the distance and angle between the modules of the dual camera. In the embodiment of the invention, because there is a certain phase difference between the main camera module and the auxiliary camera module of the terminal's dual camera, the depth of field Depth of each pixel unit can be computed from that phase difference;
a computing unit 1032, configured to compute the depth of field Depth of each pixel unit in the current image from the phase difference. In the embodiment of the invention, the depth of field Depth of each pixel unit in the current image is computed from the phase difference between the main and auxiliary camera modules of the dual camera.
The first determining unit 104 is configured to determine the blur parameter of each pixel unit in the current image according to the depth of field Depth. In the embodiment of the invention, the method of determining the blur parameter of each pixel unit in the current image according to the depth of field Depth specifically includes: obtaining the preset maximum value of the camera's blur level; and determining the blur parameter of the corresponding pixel unit from that preset maximum value and the depth of field Depth. Specifically, the blur parameter λ of each pixel unit in the current image can be computed according to formula one below:
λ = D_F / T_F * λ_MAX
where D_F is the depth of field Depth of each pixel unit, T_F is the distance parameter of each pixel unit, and λ_MAX is the maximum blur level set for the current image.
Specifically, in practice, to make the subject of a shot stand out, an SLR camera commonly blurs the background. The essence of blurring the background is to render each point of the background as a circle of confusion on the photosensitive surface: the larger the diameter of the circle of confusion, the "softer" the background appears. An SLR, however, adjusts this blur through aperture and focal length. For a smartphone dual camera, blurring means shooting with two cameras and then processing: one camera is responsible for imaging, while the other is used to compute a depth map, that is, the distance from the lens of each pixel or region in the image at the time of shooting. Subsequent software processing based on these distances then produces the blur effect.
It should be noted that in formula one above, D_F is the depth of field Depth of each pixel unit, obtained by the dual camera from the phase difference between the main and auxiliary camera modules; λ_MAX is the maximum blur level set for the current image, which differs between camera devices and can be read from the device's basic parameter settings; and T_F is the distance parameter of each pixel unit, referring to the distance from the dual camera of each pixel unit of the captured current image in the scene.
The blur processing unit 105 is configured to blur each pixel unit in the current image according to the determined blur parameter and the pixel color value. The method of blurring each pixel unit in the current image according to the determined blur parameter and the pixel color value specifically includes: determining the width of the current image in advance; and blurring the corresponding pixel unit according to the determined width of the current image, the determined blur parameter, and the pixel color value. Specifically, the embodiment of the invention performs the blur processing according to formulas two and three below.
Formula two is:
Formula three is:
where x[k] is the pixel color value of each pixel unit in the current image, M is the width of the current image, and λ is the determined blur parameter. This image blurring method improves the efficiency of the blur processing algorithm and allows a parameter-controlled blur to be applied to each pixel unit.
It should be noted that existing image blurring algorithms require a forward-iteration pass and a backward-iteration pass, followed by averaging according to certain rules, before the blur of the image is complete. In the blurring algorithm of the embodiment of the invention, as shown by the formulas above, the image is blurred directly from the determined image width, the blur parameter of each pixel unit, and the pixel color value, thereby avoiding the complex forward and backward iteration and the extra averaging step. This improves the efficiency of the blur processing algorithm, keeps its computational complexity low, and allows a parameter-controlled blur to be applied to each pixel unit individually. Specifically, the element x[k] computed by formula two refers to the pixel color value of each pixel unit in the current image; that is, x[k] is the YUV value of each pixel unit in the current image acquired in the above steps of this embodiment.
Further, the first determining unit is specifically configured to:
obtain the preset maximum value of the camera's blur level;
determine the blur parameter of the corresponding pixel unit according to the preset maximum value of the blur level and the depth of field Depth.
Further, the blur processing unit is specifically configured to:
determine the width of the current image in advance;
blur the corresponding pixel unit according to the determined width of the current image, the determined blur parameter, and the pixel color value.
As can be seen from the above, this embodiment of the present invention obtains the current image captured by the dual cameras, obtains the pixel color value of each pixel unit in the current image, obtains the depth of field Depth of each pixel unit in the current image, determines the blur parameter of each pixel unit in the current image according to the depth of field Depth, and performs blur processing on each pixel unit in the current image according to the determined blur parameter and the pixel color value. The image captured by the dual cameras can thus be blurred effectively, quickly and accurately to achieve a blur effect; the method has low computational complexity when blurring the image, and can improve the user experience.
Fig. 5 is a schematic structural diagram of an image blurring device according to the present invention. As shown in Fig. 5, the device may include: an input apparatus 301, an output apparatus 302, a transceiver apparatus 303, a memory 304 and a processor 305, wherein:
The input apparatus 301 is configured to receive input data from an external access control device. In a specific implementation, the input apparatus 301 of this embodiment of the present invention may include a keyboard, a mouse, a photoelectric input apparatus, a sound input apparatus, a touch input apparatus, a scanner and the like.
The output apparatus 302 is configured to output data of the access control device to the outside. In a specific implementation, the output apparatus 302 of this embodiment of the present invention may include a display, a loudspeaker, a printer and the like.
The transceiver apparatus 303 is configured to send data to, or receive data from, other devices over a communication link. In a specific implementation, the transceiver apparatus 303 of this embodiment of the present invention may include a transceiver device such as a radio-frequency antenna.
The memory 304 is configured to store program data with various functions. The data stored in the memory 304 in this embodiment of the present invention includes program data that can be called and run. In a specific implementation, the memory 304 of this embodiment of the present invention may be a system memory, such as volatile memory (e.g. RAM), non-volatile memory (e.g. ROM, flash memory), or a combination of the two. In a specific implementation, the memory 304 of this embodiment of the present invention may also be an external memory outside the system, such as a magnetic disk, an optical disc or a magnetic tape.
The processor 305 is configured to run the program data stored in the memory 304 and to perform the following operations:
obtaining the current image captured by the camera;
obtaining the pixel color value of each pixel unit in the current image;
obtaining the depth of field Depth of each pixel unit in the current image;
determining the blur parameter of each pixel unit in the current image according to the depth of field Depth;
performing blur processing on each pixel unit in the current image according to the determined blur parameter and the pixel color value.
Further, determining the blur parameter of each pixel unit in the current image according to the depth of field Depth specifically includes:
obtaining the preset maximum value of the blur level of the camera;
determining the blur parameter of the corresponding pixel unit according to the preset maximum value of the blur level and the depth of field Depth.
Further, performing blur processing on each pixel unit in the current image according to the determined blur parameter and the pixel color value specifically includes:
determining the width of the current image in advance;
performing blur processing on the corresponding pixel unit according to the determined width of the current image, the determined blur parameter and the pixel color value.
Further, the depth of field Depth is obtained from between the modules of the dual cameras, and obtaining the depth of field Depth of each pixel unit in the current image specifically includes:
determining the phase difference produced according to the distance and angle between the modules of the dual cameras;
calculating the depth of field Depth of each pixel unit in the current image according to the phase difference.
Further, the pixel unit is specifically a pixel, or a pixel block composed of several pixels.
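The phase difference between the two camera modules is, in effect, a stereo disparity, and the standard triangulation relation depth = baseline × focal length / disparity is one plausible way to realize the calculation above. The patent does not give the formula, so the sketch below, including the name `depth_from_disparity`, is an assumption:

```python
def depth_from_disparity(baseline_mm, focal_px, disparity_px):
    """Stereo triangulation: depth grows as the phase difference
    (disparity) between the two camera modules shrinks."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    # Result is in millimetres when the baseline is in millimetres
    # and the focal length and disparity are both in pixels.
    return baseline_mm * focal_px / disparity_px
```

With a 10 mm baseline and a 1000-pixel focal length, a pixel unit with a 50-pixel disparity would lie about 200 mm from the cameras; repeating this per pixel unit yields the per-pixel depth of field Depth used by the determining step.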
Those skilled in the art will understand that the embodiment of the image blurring device shown in Fig. 5 does not limit the specific composition of the image blurring device; in other embodiments, the image blurring device may include more or fewer components than illustrated, combine certain components, or arrange the components differently. For example, in some embodiments the image blurring device may include only the memory and the processor; in such embodiments the structure and functions of the memory and the processor are consistent with the embodiment shown in Fig. 5 and are not repeated here.
The units in all embodiments of the present invention may be implemented by a general-purpose integrated circuit, such as a CPU (Central Processing Unit), or by an ASIC (Application Specific Integrated Circuit).
The steps of the method in the embodiments of the present invention may be reordered, combined and deleted according to actual needs.
The units of the terminal in the embodiments of the present invention may be combined, divided and deleted according to actual needs.
The above is only a specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or replacement that can readily occur to those familiar with the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
1. A method of image blurring, characterized in that the method comprises:
obtaining a current image captured by a camera;
obtaining a pixel color value of each pixel unit in the current image;
obtaining a depth of field Depth of each pixel unit in the current image;
determining a blur parameter of each pixel unit in the current image according to the depth of field Depth;
performing blur processing on each pixel unit in the current image according to the determined blur parameter and the pixel color value.
2. The method according to claim 1, characterized in that determining the blur parameter of each pixel unit in the current image according to the depth of field Depth specifically comprises:
obtaining a preset maximum value of a blur level of the camera;
determining the blur parameter of the corresponding pixel unit according to the preset maximum value of the blur level and the depth of field Depth.
3. The method according to claim 1, characterized in that performing blur processing on each pixel unit in the current image according to the determined blur parameter and the pixel color value specifically comprises:
determining a width of the current image in advance;
performing blur processing on the corresponding pixel unit according to the determined width of the current image, the determined blur parameter and the pixel color value.
4. The method according to claim 1, characterized in that the depth of field Depth is obtained from between modules of dual cameras, and obtaining the depth of field Depth of each pixel unit in the current image specifically comprises:
determining a phase difference produced according to a distance and an angle between the modules of the dual cameras;
calculating the depth of field Depth of each pixel unit in the current image according to the phase difference.
5. The method according to claim 1, characterized in that the pixel unit is specifically a pixel, or a pixel block composed of several pixels.
6. A terminal, characterized in that the terminal comprises:
a first acquiring unit, configured to obtain a current image captured by dual cameras;
a second acquiring unit, configured to obtain a pixel color value of each pixel unit in the current image;
a third acquiring unit, configured to obtain a depth of field Depth of each pixel unit in the current image;
a first determining unit, configured to determine a blur parameter of each pixel unit in the current image according to the depth of field Depth;
a blur processing unit, configured to perform blur processing on each pixel unit in the current image according to the determined blur parameter and the pixel color value.
7. The terminal according to claim 6, characterized in that the first determining unit is specifically configured to:
obtain a preset maximum value of a blur level of the camera;
determine the blur parameter of the corresponding pixel unit according to the preset maximum value of the blur level and the depth of field Depth.
8. The terminal according to claim 6, characterized in that the blur processing unit is specifically configured to:
determine a width of the current image in advance;
perform blur processing on the corresponding pixel unit according to the determined width of the current image, the determined blur parameter and the pixel color value.
9. The terminal according to claim 6, characterized in that the depth of field Depth is obtained from between modules of dual cameras, and the third acquiring unit specifically comprises:
a second determining unit, configured to determine a phase difference produced according to a distance and an angle between the modules of the dual cameras;
a calculating unit, configured to calculate the depth of field Depth of each pixel unit in the current image according to the phase difference.
10. The terminal according to claim 6, characterized in that the pixel unit is specifically a pixel, or a pixel block composed of several pixels.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710183618.XA CN106960413A (en) | 2017-03-24 | 2017-03-24 | A kind of method and terminal of image virtualization |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710183618.XA CN106960413A (en) | 2017-03-24 | 2017-03-24 | A kind of method and terminal of image virtualization |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106960413A true CN106960413A (en) | 2017-07-18 |
Family
ID=59470958
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710183618.XA Withdrawn CN106960413A (en) | 2017-03-24 | 2017-03-24 | A kind of method and terminal of image virtualization |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106960413A (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107959778A (en) * | 2017-11-30 | 2018-04-24 | 广东欧珀移动通信有限公司 | Imaging method and device based on dual camera |
CN108008895A (en) * | 2017-12-18 | 2018-05-08 | 信利光电股份有限公司 | A kind of background-blurring method, device, equipment and computer-readable recording medium |
CN108305223A (en) * | 2018-01-09 | 2018-07-20 | 珠海格力电器股份有限公司 | Image background blurs processing method and processing device |
CN108320263A (en) * | 2017-12-29 | 2018-07-24 | 维沃移动通信有限公司 | A kind of method, device and mobile terminal of image procossing |
CN108449589A (en) * | 2018-03-26 | 2018-08-24 | 德淮半导体有限公司 | Handle the method, apparatus and electronic equipment of image |
WO2019085603A1 (en) * | 2017-11-01 | 2019-05-09 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method for image-processing and mobile terminal using dual cameras |
CN110022430A (en) * | 2018-01-10 | 2019-07-16 | 中兴通讯股份有限公司 | Image weakening method, device, mobile terminal and computer readable storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104333700A (en) * | 2014-11-28 | 2015-02-04 | 广东欧珀移动通信有限公司 | Image blurring method and image blurring device |
CN104424640A (en) * | 2013-09-06 | 2015-03-18 | 格科微电子(上海)有限公司 | Method and device for carrying out blurring processing on images |
CN105100615A (en) * | 2015-07-24 | 2015-11-25 | 青岛海信移动通信技术股份有限公司 | Image preview method, apparatus and terminal |
CN106357980A (en) * | 2016-10-19 | 2017-01-25 | 广东欧珀移动通信有限公司 | Image virtualization processing method and device as well as mobile terminal |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019085603A1 (en) * | 2017-11-01 | 2019-05-09 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method for image-processing and mobile terminal using dual cameras |
US10757312B2 (en) | 2017-11-01 | 2020-08-25 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method for image-processing and mobile terminal using dual cameras |
CN107959778A (en) * | 2017-11-30 | 2018-04-24 | 广东欧珀移动通信有限公司 | Imaging method and device based on dual camera |
WO2019105207A1 (en) * | 2017-11-30 | 2019-06-06 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method and device for dual-camera-based imaging and storage medium |
US10616459B2 (en) | 2017-11-30 | 2020-04-07 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method and device for dual-camera-based imaging and storage medium |
CN108008895A (en) * | 2017-12-18 | 2018-05-08 | 信利光电股份有限公司 | A kind of background-blurring method, device, equipment and computer-readable recording medium |
CN108320263A (en) * | 2017-12-29 | 2018-07-24 | 维沃移动通信有限公司 | A kind of method, device and mobile terminal of image procossing |
CN108305223A (en) * | 2018-01-09 | 2018-07-20 | 珠海格力电器股份有限公司 | Image background blurs processing method and processing device |
CN110022430A (en) * | 2018-01-10 | 2019-07-16 | 中兴通讯股份有限公司 | Image weakening method, device, mobile terminal and computer readable storage medium |
CN108449589A (en) * | 2018-03-26 | 2018-08-24 | 德淮半导体有限公司 | Handle the method, apparatus and electronic equipment of image |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106960413A (en) | A kind of method and terminal of image virtualization | |
CN111328448B (en) | Method and apparatus for image processing | |
CN109474780B (en) | Method and device for image processing | |
KR102179262B1 (en) | Lens distortion correction device and application processor having the same | |
US9906732B2 (en) | Image processing device, image capture device, image processing method, and program | |
KR102010712B1 (en) | Distortion Correction Method and Terminal | |
US9866750B2 (en) | Image processing device, imaging device, image processing method, and image processing program | |
CN105100615A (en) | Image preview method, apparatus and terminal | |
CN111062881A (en) | Image processing method and device, storage medium and electronic equipment | |
CN102656877A (en) | Digital image combining to produce optical effects | |
US9892495B2 (en) | Image processing device, imaging device, image processing method, and image processing program | |
US9619871B2 (en) | Image processing device, imaging apparatus, image processing method, and program | |
JP5851655B2 (en) | Image processing apparatus, imaging apparatus, image processing method, and program | |
CN111064895B (en) | Virtual shooting method and electronic equipment | |
US9633418B2 (en) | Image processing device, imaging apparatus, image processing method, and program | |
TWI502548B (en) | Real-time image processing method and device therefor | |
CN113313661A (en) | Image fusion method and device, electronic equipment and computer readable storage medium | |
CN113542600B (en) | Image generation method, device, chip, terminal and storage medium | |
US20180204311A1 (en) | Image processing device, image processing method, and program | |
KR20160073249A (en) | Image processing device for removing color fringe | |
CN109257540B (en) | Photographing correction method of multi-photographing lens group and photographing device | |
CN115082350A (en) | Stroboscopic image processing method and device, electronic device and readable storage medium | |
CN111416937B (en) | Image processing method, image processing device, storage medium and mobile equipment | |
KR102282457B1 (en) | Method and apparatus for reducing color moire, and image processing apparatus | |
CN109816620B (en) | Image processing method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WW01 | Invention patent application withdrawn after publication | ||
Application publication date: 20170718 |