CN104159038B - Imaging control method and apparatus, and imaging device, for shallow depth-of-field effect images - Google Patents

Imaging control method and apparatus, and imaging device, for shallow depth-of-field effect images

Info

Publication number
CN104159038B
Authority
CN
China
Prior art keywords
information
photograph
scene
image
taken
Prior art date
Legal status
Active
Application number
CN201410426154.7A
Other languages
Chinese (zh)
Other versions
CN104159038A (en)
Inventor
杜琳
Current Assignee
Beijing Zhigu Tech Co Ltd
Original Assignee
Beijing Zhigu Tech Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zhigu Tech Co Ltd
Priority to CN201410426154.7A
Publication of CN104159038A
Application granted
Publication of CN104159038B

Landscapes

  • Studio Devices (AREA)

Abstract

Embodiments of the present application disclose an imaging control method and apparatus, and an imaging device, for shallow depth-of-field effect images. One imaging control method for a shallow depth-of-field effect image includes: determining target depth-of-field information of a scene to be captured; determining target pixel density distribution information of an image according to the target depth-of-field information; adjusting the pixel density distribution of an image sensor according to the target pixel density distribution information; and capturing an image of the scene to be captured with the adjusted image sensor. The technical solutions provided by the embodiments of the present application can make full use of all pixels of the image sensor to obtain a shallow depth-of-field effect image, better meeting users' diverse application needs.

Description

Imaging control method and apparatus, and imaging device, for shallow depth-of-field effect images
Technical field
The present application relates to the field of image acquisition technology, and in particular to an imaging control method and apparatus, and an imaging device, for shallow depth-of-field effect images.
Background technology
The depth of field (DoF) generally refers to the range of object distances within which a camera lens can sharply image a scene to be captured. The region within this object distance range is called in-focus, and the region outside it is called out-of-focus. The in-focus region is imaged sharply, while the out-of-focus region may be imaged sharply or appear blurred, depending on the depth of the depth of field: with a deep depth of field, both the in-focus and out-of-focus regions are imaged sharply, and obtaining a deep depth-of-field image places very high demands on the lens; with a shallow depth of field, the in-focus region is imaged sharply while the out-of-focus region is blurred.
There are usually two methods of obtaining a shallow depth-of-field effect image. One is to adjust parameters such as the aperture size of the lens, the physical focal length, and the focusing distance between the lens and the object to be captured, so that part of the captured image is sharp and part is blurred, for example a sharp foreground against a blurred background. The other is to process an already captured picture with image processing software using a blur algorithm, so that parts of the processed image are blurred, imitating the bokeh effect of a lens.
Summary of the invention
A brief overview of the present application is given below in order to provide a basic understanding of some aspects of the application. It should be understood that this overview is not an exhaustive summary of the application. It is not intended to identify key or critical parts of the application, nor to limit the scope of the application. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that follows.
Embodiments of the present application provide an imaging control method and apparatus, and an imaging device, for shallow depth-of-field effect images.
In one aspect, an embodiment of the present application provides an imaging control method for a shallow depth-of-field effect image, comprising:
determining target depth-of-field information of a scene to be captured;
determining target pixel density distribution information of an image according to the target depth-of-field information;
adjusting the pixel density distribution of an image sensor according to the target pixel density distribution information;
capturing an image of the scene to be captured with the adjusted image sensor.
In another aspect, an embodiment of the present application further provides an imaging control apparatus, comprising:
a target depth-of-field information determining module, configured to determine target depth-of-field information of a scene to be captured;
a target pixel density distribution information determining module, configured to determine target pixel density distribution information of an image according to the target depth-of-field information;
a pixel density adjustment module, configured to adjust the pixel density distribution of an image sensor according to the target pixel density distribution information;
an image acquisition module, configured to capture an image of the scene to be captured with the adjusted image sensor.
In yet another aspect, an embodiment of the present application provides an imaging device, comprising an image sensor and the above imaging control apparatus, the imaging control apparatus being connected to the image sensor.
In the embodiments of the present application, every pixel of the image sensor participates in image acquisition during the acquisition process. Because the pixel density distribution of the image sensor is adjusted according to the target pixel density distribution information, and the target pixel density distribution information is determined according to the target depth-of-field information of the scene to be captured, the sharpness of different regions of the image captured by the adjusted image sensor exhibits a differentiated distribution corresponding to the target pixel density distribution information: the parts that the target depth-of-field information requires to be rendered sharply have more pixels participating in image acquisition, so those parts have higher sharpness and the acquisition efficiency is improved, while the parts that the target depth-of-field information does not require to be rendered sharply have relatively fewer pixels participating in image acquisition and appear more blurred. The pixels of the image sensor are thus fully utilized as a whole to achieve a shallow depth-of-field image effect, better meeting users' diverse application needs.
These and other advantages of the present application will become more apparent from the following detailed description of alternative embodiments of the present application in conjunction with the accompanying drawings.
Brief description of the drawings
The present application may be better understood by reference to the following description given in conjunction with the accompanying drawings, in which the same or similar reference numerals are used throughout to denote the same or similar components. The drawings, together with the following detailed description, are included in and form part of this specification, and serve to further illustrate alternative embodiments of the present application and to explain its principles and advantages. In the drawings:
Fig. 1a is a flowchart of an imaging control method for a shallow depth-of-field effect image provided by an embodiment of the present application;
Fig. 1b is a schematic structural diagram of a first pixel-density-adjustable image sensor provided by an embodiment of the present application;
Fig. 1c is a schematic structural diagram of a second pixel-density-adjustable image sensor provided by an embodiment of the present application;
Fig. 1d is a schematic structural diagram of a third pixel-density-adjustable image sensor provided by an embodiment of the present application;
Fig. 1e is a schematic structural diagram of a fourth pixel-density-adjustable image sensor provided by an embodiment of the present application;
Fig. 1f is an example scenario of pixel density adjustment of an image sensor provided by an embodiment of the present application under non-uniform light field excitation;
Fig. 1g is a schematic structural diagram of a fifth pixel-density-adjustable image sensor provided by an embodiment of the present application;
Fig. 1h is a schematic structural diagram of a sixth pixel-density-adjustable image sensor provided by an embodiment of the present application;
Fig. 1i is a schematic structural diagram of a seventh pixel-density-adjustable image sensor provided by an embodiment of the present application;
Fig. 1j is a schematic structural diagram of an eighth pixel-density-adjustable image sensor provided by an embodiment of the present application;
Fig. 2 is a schematic diagram of an optional optical path of a scene to be captured according to an embodiment of the present application;
Fig. 3 is a schematic diagram of an optional optical path for circle-of-confusion diameter calculation according to an embodiment of the present application;
Fig. 4 is an optional example scenario of circle-of-confusion distribution according to an embodiment of the present application;
Fig. 5a is an example of the pixel density distribution of an image sensor according to an embodiment of the present application;
Fig. 5b is an example of the pixel density distribution of the image sensor after adjustment according to an embodiment of the present application;
Fig. 6 is a logical block diagram of a first imaging control apparatus provided by an embodiment of the present application;
Fig. 7 is a logical block diagram of a second imaging control apparatus provided by an embodiment of the present application;
Fig. 8 is a logical block diagram of a third imaging control apparatus provided by an embodiment of the present application;
Fig. 9 is a logical block diagram of a fourth imaging control apparatus provided by an embodiment of the present application;
Fig. 10 is a logical block diagram of a fifth imaging control apparatus provided by an embodiment of the present application;
Fig. 11 is a logical block diagram of a sixth imaging control apparatus provided by an embodiment of the present application;
Fig. 12 is a logical block diagram of an imaging device provided by an embodiment of the present application.
It will be appreciated by those skilled in the art that the elements in the drawings are shown for simplicity and clarity only and are not necessarily drawn to scale. For example, the dimensions of some elements may be exaggerated relative to other elements to help improve understanding of the embodiments of the present application.
Embodiments
Exemplary embodiments of the present application are described in detail below in conjunction with the accompanying drawings. For the sake of clarity and conciseness, not all features of an actual implementation are described in this specification. It should be understood, however, that in developing any such actual embodiment, many implementation-specific decisions must be made in order to achieve the developer's specific goals, for example compliance with system-related and business-related constraints, and these constraints may vary from one implementation to another. Moreover, it should be understood that, although such development work may be very complex and time-consuming, it is merely a routine undertaking for those skilled in the art having the benefit of this disclosure.
It should also be noted here that, in order to avoid obscuring the present application with unnecessary detail, only the device structures and/or processing steps closely related to the solutions of the present application are depicted in the drawings and the description, while components and processing steps that are known to those of ordinary skill in the art and of little relevance to the present application are omitted.
Specific embodiments of the present application are described in further detail below in conjunction with the accompanying drawings (in which identical reference numerals denote identical elements) and embodiments. The following embodiments are intended to illustrate the present application, but not to limit its scope.
Those skilled in the art will understand that terms such as "first" and "second" in the present application are merely used to distinguish different steps, devices, modules, or the like, and neither denote any particular technical meaning nor indicate a necessary logical order between them.
Fig. 1a is a flowchart of an imaging control method for a shallow depth-of-field effect image provided by an embodiment of the present application. The executing body of the imaging control method provided by the embodiments of the present application may be an imaging control apparatus, which may, by executing the imaging control method, perform imaging control of static or dynamic images in applications such as, but not limited to, photographing, video recording, photography, and video surveillance. The physical form of the imaging control apparatus is not restricted; for example, the imaging control apparatus may be an independent component that communicates and cooperates with an imaging device including an image sensor, or the imaging control apparatus may be integrated as a functional module into an imaging device including an image sensor, which is not limited in the embodiments of the present application.
Specifically, as shown in Fig. 1a, an imaging control method for a shallow depth-of-field effect image provided by an embodiment of the present application includes:
S101: Determine target depth-of-field information of a scene to be captured.
The target depth-of-field information is typically used to characterize the expectations of a user or an imaging device regarding the sharply imaged region of the scene to be captured relative to the focal plane of the imaging device and/or its degree of blur, for example: which objects in the scene to be captured need to be imaged sharply, the desired degree of blur of other objects in the scene, and so on.
S102: Determine target pixel density distribution information of an image according to the target depth-of-field information.
The target pixel density distribution information of the scene to be captured is typically used to characterize the expectations of the user or the device regarding the presentation of the finally obtained image. In the embodiments of the present application, the target pixel density distribution information of the image is determined according to the target depth-of-field information. For example, the target pixel densities of the image corresponding to different view fields of the scene to be captured may be given a differentiated distribution in the target pixel density distribution information according to the target depth-of-field information, so that the image sharpness of regions corresponding to different target pixel densities differs; the determined target pixel density distribution information thus reflects the expectation of the user or the imaging device regarding the shallow depth-of-field effect of the target image obtained by capturing the scene.
S103: Adjust the pixel density distribution of an image sensor according to the target pixel density distribution information.
The image sensor is a pixel-density-adjustable image sensor, such as a flexible image sensor that includes a flexible substrate and a plurality of image sensor pixels formed on the flexible substrate, where the flexible substrate can stretch, bend, or otherwise change under certain conditions to adjust its pixel density distribution. Taking advantage of this adjustability of the pixel density distribution, the embodiments of the present application adjust the pixel density distribution of the image sensor according to the target pixel density distribution information so that, after adjustment, the pixel density distribution of the image sensor corresponds to the target pixel density distribution information, or is as close as possible to it.
S104: Capture an image of the scene to be captured with the adjusted image sensor.
In the embodiments of the present application, every pixel of the image sensor participates in image acquisition during the acquisition process. Because the pixel density distribution of the image sensor is adjusted according to the target pixel density distribution information, and the target pixel density distribution information is determined according to the target depth-of-field information of the scene to be captured, the sharpness of different regions of the image captured by the adjusted image sensor exhibits a differentiated distribution corresponding to the target pixel density distribution information: the parts that the target depth-of-field information requires to be rendered sharply have more pixels participating in image acquisition, so those parts have higher sharpness and the acquisition efficiency is improved, while the parts that the target depth-of-field information does not require to be rendered sharply have relatively fewer pixels participating in image acquisition and appear more blurred. The pixels of the image sensor are thus fully utilized as a whole to achieve a shallow depth-of-field image effect, better meeting users' diverse application needs.
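As a rough illustration of the S101–S104 control flow only, the following Python sketch maps a per-object-point depth map to a target pixel density and hands it to a sensor driver. All names here (target_pixel_density, StubSensor, set_pixel_density) are hypothetical placeholders invented for the example, not interfaces defined by the patent.

```python
import numpy as np

def target_pixel_density(depth_map, focus_dist, dof_near, dof_far,
                         d_max=1.0, d_min=0.25):
    """S101+S102 sketch: points inside the in-focus range get the maximum
    density d_max; out-of-focus points get a density that falls off with
    their distance from that range, down to d_min."""
    dist = np.where(depth_map < focus_dist - dof_near,
                    (focus_dist - dof_near) - depth_map,
                    np.where(depth_map > focus_dist + dof_far,
                             depth_map - (focus_dist + dof_far),
                             0.0))
    falloff = np.clip(dist / dist.max() if dist.max() > 0 else dist, 0.0, 1.0)
    return d_max - (d_max - d_min) * falloff

class StubSensor:
    """Stand-in for a pixel-density-adjustable image sensor driver."""
    def set_pixel_density(self, density_map):   # S103
        self.density_map = density_map
    def capture(self):                          # S104
        return np.zeros_like(self.density_map)  # placeholder image data

if __name__ == "__main__":
    depth_map = np.random.uniform(0.5, 10.0, size=(48, 64))  # stub depth map (m)
    density = target_pixel_density(depth_map, focus_dist=2.0,
                                   dof_near=0.3, dof_far=0.5)
    sensor = StubSensor()
    sensor.set_pixel_density(density)
    image = sensor.capture()
```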
In the above technical solutions, the manner of obtaining the target depth-of-field information is not restricted.
In one optional implementation, depth-of-field determination information may be obtained, and the target depth-of-field information of the scene to be captured is determined according to the depth-of-field determination information. The manner of obtaining the depth-of-field determination information is not restricted; for example, it may be determined by the user, or determined according to information set by the user, or be preset information of the imaging device, or be obtained from an external device, and so on. After the depth-of-field determination information is obtained, the target depth-of-field information of the scene to be captured can be determined from it, which makes the determination of the target depth-of-field information very flexible and better meets diverse practical application needs.
In another optional implementation, object-point depth information of the scene to be captured may be obtained, and the target depth-of-field information of the scene to be captured is determined according to the object-point depth information. The object-point depth information of the scene may include, but is not limited to, an object-point depth map. The manner of obtaining the object-point depth information is not restricted; for example, a depth map of the scene to be captured may be obtained by shooting with a depth sensor, a multi-lens camera, or similar means. After the object-point depth information is obtained, this scheme determines the target depth-of-field information of the scene accordingly, making the determination of the target depth-of-field information more convenient and accurate.
The content of the target depth-of-field information may include, but is not limited to, at least one of the following: a target depth of field of the scene to be captured relative to the focal plane; distance information from at least some out-of-focus object points of the scene to be captured to the focal plane; and target out-of-focus blur degree information.
(1) For example, the target depth-of-field information includes: the target depth of field of the scene to be captured relative to the focal plane. In this case, determining the target pixel density distribution information includes: determining the target pixel density distribution information according to the target depth of field. The target depth of field typically characterizes the range of object distances, relative to the focal plane, over which the scene to be captured is imaged sharply. In the target pixel density distribution information, the target pixel density of the image corresponding to the in-focus region is greater than that corresponding to the out-of-focus region, so that in the target image the in-focus part is imaged more sharply than the out-of-focus part, visually presenting a shallow depth-of-field image effect in which the in-focus region is sharp and the out-of-focus region is blurred.
The target depth of field may be obtained in a preset manner, among others. For example, the user may set a certain object distance range as the target depth of field; or, as another example, with reference to the optional optical path of the scene to be captured shown in Fig. 2, the user may set a virtual aperture value F, and the system calculates the depth of field ΔL according to the following depth-of-field formulas:
ΔL1 = FδL²/(f² + FδL)
ΔL2 = FδL²/(f² − FδL)
ΔL = ΔL1 + ΔL2 = 2f²FδL²/(f⁴ − F²δ²L²)
In the above formulas, ΔL denotes the depth of field, ΔL1 the front depth of field, and ΔL2 the rear depth of field; f denotes the lens focal length; F denotes the shooting f-number of the lens, which in the embodiments of the present application represents the virtual aperture value that the user can set; L denotes the focusing distance; and δ denotes the permissible circle-of-confusion diameter.
The depth of field ΔL thus calculated is the target depth of field. Afterwards, the in-focus and out-of-focus regions can be distinguished according to the target depth of field and the focal plane of the scene to be captured, and the target pixel densities of the image determined separately for the in-focus and out-of-focus regions.
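As a concrete illustration of this depth-of-field calculation, the short Python sketch below evaluates the standard front/rear depth-of-field formulas for a user-chosen virtual aperture value; the numeric values are made-up examples, not figures from the patent.

```python
def depth_of_field(f, F, L, delta):
    """Standard depth-of-field formulas.

    f     : lens focal length (m)
    F     : virtual aperture value (f-number) chosen by the user
    L     : focusing distance (m)
    delta : permissible circle-of-confusion diameter (m)
    Returns (front DoF, rear DoF, total DoF), all in metres.
    """
    dl1 = F * delta * L**2 / (f**2 + F * delta * L)   # front depth of field
    dl2 = F * delta * L**2 / (f**2 - F * delta * L)   # rear depth of field
    return dl1, dl2, dl1 + dl2

# Example with hypothetical values: 50 mm lens, virtual f/2, focused at 2 m,
# 0.03 mm permissible circle of confusion.
front, rear, total = depth_of_field(f=0.05, F=2.0, L=2.0, delta=0.00003)
```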
(2) As another example, the target depth-of-field information includes: distance information from at least some out-of-focus object points of the scene to be captured to the focal plane. In this case, determining the target pixel density distribution information includes: determining the target pixel density distribution information according to the distance information. For example, in the target pixel density distribution information, the target pixel density of the image corresponding to out-of-focus object point regions of the scene farther from the focal plane is lower than that corresponding to out-of-focus object point regions closer to the focal plane, so that the sharpness of the imaging over different distance ranges in the target image differs: object points closer to the focal plane are imaged more sharply and object points farther from the focal plane more blurrily, visually presenting a shallow depth-of-field image effect in which nearer object points are imaged more sharply and farther object points more blurrily.
Optionally, any imaging control method of the embodiments of the present application may further include: determining the focal plane of the scene to be captured. The manner of determining the focal plane is not restricted.
In one optional implementation, the focal plane may be determined according to region of interest (ROI) information, i.e.: obtaining ROI determination information, and determining the focal plane according to the ROI determination information. The ROI may include, but is not limited to, one or more of the following: at least one region of a preview image of the scene to be captured on the image sensor selected by the user (i.e. a user-selected region), at least one region of the preview image gazed at by the user (i.e. a user gaze region), and an ROI obtained by the imaging device automatically detecting the preview image. This scheme determines the focal plane of the scene to be captured according to the ROI, so that the determination of the focal plane better matches actual user demands and better meets personalized application needs.
In another optional implementation, the focal plane of the scene to be captured may be determined according to the result of image analysis, i.e.: analyzing the preview image of the scene to be captured on the image sensor, and determining the focal plane according to the image analysis result. For example, face recognition may be performed on the preview image, and the focal plane of a face is determined as the focal plane of the scene to be captured according to the recognition result. As another example, moving-object recognition may be performed on the preview image, and the focal plane of the region corresponding to a moving object is determined as the focal plane of the scene to be captured according to the recognition result. This scheme can determine the focal plane of the scene to be captured according to the image analysis result of the preview image, making the determination of the focal plane more intelligent and improving its efficiency and universality.
(3) As yet another example, the target depth-of-field information includes: target out-of-focus blur degree information. In this case, determining the target pixel density distribution information includes: determining the target pixel density distribution information according to the target out-of-focus blur degree information. For example, in the target pixel density distribution information, the target pixel density of the image corresponding to a large target out-of-focus blur degree is lower than that corresponding to a small target out-of-focus blur degree, so that, on the basis of the shallow depth-of-field image effect of a sharp in-focus region and a blurred out-of-focus region, the out-of-focus blur degree is further refined, making the visual effect of the image more vivid and meeting diverse practical application needs.
The manner of obtaining the target out-of-focus blur degree information is not restricted; for example, it may be determined by the user, or determined from the depth information of the scene to be captured, or predetermined by the imaging device, and so on. In one optional implementation, the target out-of-focus blur degree information includes: circle-of-confusion distribution information of at least some imaging points, on the image sensor, of at least some out-of-focus object points of the scene to be captured. In this case, determining the target pixel density distribution information includes: determining the target pixel density distribution information according to the circle-of-confusion distribution information of at least some of the imaging points. The specific way of determining the circle-of-confusion distribution information is not restricted. Determining the circle-of-confusion distribution information may include: determining circle-of-confusion information of at least one out-of-focus object point of the scene to be captured at at least one imaging point on the image sensor; determining circle-of-confusion information of at least some other imaging points according to the distances of at least some other out-of-focus object points of the scene from the focal plane and the determined circle-of-confusion information of the at least one imaging point; and determining the circle-of-confusion distribution information of at least some of the imaging points according to the circle-of-confusion information thus determined. For example, with reference to Fig. 3, the user may specify the circle-of-confusion diameters of one or more imaging points, on the image sensor, of one or more out-of-focus object points in the scene to be captured. From the circle-of-confusion diameters of the imaging points corresponding to these object points as specified by the user, the object distances of these object points, the known focal length of the lens for the scene to be captured, and the object distance of the focal plane, the virtual aperture value N desired by the user can be determined from the circle-of-confusion diameter formula. Then, given the object distances of other object points in the scene, the circle-of-confusion diameters of those object points at their corresponding imaging points on the image sensor can be determined according to the following circle-of-confusion diameter formula:
d = (f/N) × ((U1(U2 − f))/(U2(U1 − f)) − 1)
In the above formula, f denotes the focal length of the lens, U1 the object distance of the focal plane, U2 the object distance of the object point whose circle of confusion is to be calculated, N the virtual aperture value desired by the user, and d the circle-of-confusion diameter, at the corresponding imaging point on the image sensor, of the object point at distance U2.
After the circle-of-confusion diameters of the imaging points corresponding to the object points are obtained, the target pixel density distribution information of the image can be determined according to the circle-of-confusion distribution information. When the ranges of different circles of confusion overlap to some extent, the target pixel density of the overlapping region can be determined as actually needed. In one optional scenario, with reference to Fig. 4, the circles of confusion of three out-of-focus object points at three imaging points of the image sensor are denoted A, B, and C, with radii increasing in that order, where the target pixel density of a region with a small circle-of-confusion diameter is greater than that of a region with a large circle-of-confusion diameter. The three circles of confusion, however, partly overlap, in which case the target pixel densities of the different regions can be determined by following a certain rule, which may include, but is not limited to, a higher-density-takes-priority rule: for example, the intersection of A with B or C takes the target pixel density a corresponding to A, and the intersection of B and C takes the target pixel density b corresponding to B (a sketch of this assignment is given below). This scheme makes the setting of the out-of-focus blur degree more flexible.
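Purely as an illustration, the following Python sketch computes circle-of-confusion diameters from the formula above and assigns per-region target pixel densities with a higher-density-takes-priority rule for overlaps; the density mapping, the region coordinates, and all numeric values are hypothetical and not taken from the patent.

```python
import numpy as np

def coc_diameter(f, N, U1, U2):
    """Circle-of-confusion diameter d for an object point at distance U2
    when a lens of focal length f (f-number N) is focused at U1."""
    return abs(f / N * (U1 * (U2 - f) / (U2 * (U1 - f)) - 1.0))

def density_from_coc(d, d_ref, rho_max=1.0, rho_min=0.25):
    """Hypothetical mapping: smaller circle of confusion -> higher density."""
    return max(rho_min, rho_max * d_ref / max(d, d_ref))

# Three out-of-focus object points A, B, C (hypothetical distances, metres).
f, N, U1 = 0.05, 2.0, 2.0
points = {"A": 2.5, "B": 4.0, "C": 8.0}
d_ref = coc_diameter(f, N, U1, points["A"])        # smallest blur circle

density_map = np.full((48, 64), 0.25)              # background density
regions = {                                        # hypothetical sensor regions
    "A": (slice(10, 25), slice(10, 30)),
    "B": (slice(18, 38), slice(22, 48)),
    "C": (slice(30, 46), slice(40, 62)),
}
for name in ("C", "B", "A"):                       # apply lower densities first...
    rho = density_from_coc(coc_diameter(f, N, U1, points[name]), d_ref)
    region = regions[name]
    # ...and let the higher density win wherever the regions overlap
    density_map[region] = np.maximum(density_map[region], rho)
```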
In the embodiments of the present application, after the target pixel density distribution information is obtained, the pixel density distribution of the image sensor can be adjusted according to the target pixel density distribution information. A schematic diagram of the image sensor before the pixel density distribution adjustment is shown in Fig. 5a, and after the adjustment in Fig. 5b, where after adjustment the pixel density of the image sensor is locally higher in some places and locally lower in others. In practical applications, the manner of adjusting the pixel density distribution of the image sensor can be chosen as actually needed, which is not limited in the embodiments of the present application. In one optional implementation, deformation control information of a controllably deformable material portion may be determined according to the target pixel density distribution information; the controllably deformable material portion is controlled to deform according to the deformation control information, and the pixel density distribution of the image sensor is adjusted accordingly through the deformation of the controllably deformable material portion (an illustrative sketch follows below). This scheme adjusts the pixel distribution of the image sensor by controlling the deformation of a controllably deformable material portion, and is simple and easy to implement.
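The following Python sketch, offered only as an assumed illustration of this step, derives a per-region deformation (strain) command from the ratio between the current and target pixel densities; the one-dimensional strain model and the idea of translating it into a field command are invented for the example and do not describe the patent's actual control scheme.

```python
import numpy as np

def deformation_control(current_density, target_density, max_strain=0.2):
    """Derive a per-region strain command for the controllably deformable
    material portion. Shrinking a region (negative strain) packs its pixels
    closer together and raises the local pixel density, and vice versa.
    Assumes a simple 1-D relation: target/current ~ 1 / (1 + strain)."""
    strain = current_density / np.maximum(target_density, 1e-6) - 1.0
    return np.clip(strain, -max_strain, max_strain)

# Hypothetical usage: uniform sensor, denser centre requested.
current = np.full((48, 64), 0.5)
target = np.full((48, 64), 0.3)
target[16:32, 20:44] = 0.9                       # region to be rendered sharply
strain_map = deformation_control(current, target)
# strain_map would then be translated into the external field (electric,
# magnetic, or optical) applied to the controllably deformable material portion.
```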
The controllably deformable material portion is one that can be made to deform by changing a certain external influencing factor acting on it (such as an external field); when the external field acting on it is removed or changed, the deformation of the controllably deformable material portion can recover.
Fig. 1b is a schematic structural diagram of a pixel-density-adjustable image sensor provided by an embodiment of the present application. As shown in Fig. 1b, the pixel-density-adjustable image sensor provided by the embodiment of the present application includes a plurality of image sensor pixels 11 and a controllably deformable material portion 12. The image sensor performs image acquisition through the image sensor pixels 11, the plurality of image sensor pixels 11 are distributed in an array, and the controllably deformable material portion 12 is connected to each of the plurality of image sensor pixels 11; the controllably deformable material portion 12 can deform under the action of an external field, and the density distribution of the plurality of image sensor pixels 11 is adjusted accordingly through the deformation of the controllably deformable material portion 12.
In the technical solutions provided by the embodiments of the present application, the controllably deformable material portion can be made to deform by changing a certain external field factor acting on it, and when that external field factor is removed or changed, the deformation of the controllably deformable material portion can recover. The external field can be a corresponding control field chosen according to the deformation characteristics of the controllably deformable material portion; for example, the external field includes, but is not limited to, an external electric field, a magnetic field, a light field, and so on. An image sensor pixel may include, but is not limited to, at least one photoelectric conversion unit. Each image sensor pixel and the controllably deformable material portion may be closely connected by, but not limited to, bonding, so that when the controllably deformable material portion deforms, the spacing between the image sensor pixels is adjusted accordingly, thereby changing the density distribution of the image sensor pixels and achieving the effect of giving different regions of the image sensor a differentiated pixel density distribution as actually needed.
In practical applications, applying an unevenly distributed external field to different regions of the controllably deformable material portion can cause different parts of the controllably deformable material portion to deform to different degrees, thereby adjusting the overall density distribution of the image sensor pixels. Optionally, the external field may be applied to regions of the controllably deformable material portion that do not overlap the plurality of image sensor pixels, so that the regions of the controllably deformable material portion overlapping the image sensor pixels do not deform, while the density distribution of the image sensor pixels is still changed through the deformation of the other parts of the controllably deformable material portion. This scheme helps avoid damage to the image sensor pixels caused by the deformation of the controllably deformable material portion.
In practical applications, at least one suitable controllably deformable material may be selected as needed to prepare the controllably deformable material portion, so that the controllably deformable material portion is deformable and its deformation is recoverable. Optionally, the controllably deformable material portion is prepared from at least one or more of the following controllably deformable materials: piezoelectric materials, electroactive polymers, photo-deformable materials, and magnetostrictive materials.
A piezoelectric material can undergo mechanical deformation under the action of an electric field. A controllably deformable material portion prepared from a piezoelectric material is hereinafter referred to as a piezoelectric material portion. Using this physical characteristic of piezoelectric materials, the embodiments of the present application may, but are not limited to, determine, according to the target pixel density distribution information, the electric field control information required to make the piezoelectric material portion undergo the corresponding mechanical deformation, and control, according to the electric field control information, the electric field acting on the piezoelectric material portion so that the piezoelectric material portion undergoes the corresponding mechanical deformation. The pixel density distribution of the image sensor is adjusted accordingly through the mechanical deformation of the piezoelectric material portion, thereby achieving the purpose of adjusting the pixel density distribution of the image sensor according to the target pixel density distribution information. The piezoelectric material may include, but is not limited to, at least one of: piezoelectric ceramics and piezoelectric crystals. This scheme makes full use of the physical characteristics of piezoelectric materials to adjust the pixel density distribution of the image sensor.
An electroactive polymer (EAP) is a polymer material that can change its shape or size under the action of an electric field. A controllably deformable material portion prepared from an electroactive polymer is hereinafter referred to as an electroactive polymer portion. Using this physical characteristic of electroactive polymers, the embodiments of the present application may, but are not limited to, determine, according to the target pixel density distribution information, the electric field control information required to make the electroactive polymer portion undergo the corresponding deformation, and control, according to the electric field control information, the electric field acting on the electroactive polymer layer so that the electroactive polymer layer undergoes the corresponding deformation. The pixel density distribution of the image sensor is adjusted accordingly through the deformation of the electroactive polymer layer, thereby achieving the purpose of adjusting the pixel density distribution of the image sensor according to the target pixel density distribution information. The electroactive polymer may include, but is not limited to, at least one of: electronic electroactive polymers and ionic electroactive polymers; the electronic electroactive polymers include at least one of ferroelectric polymers (such as polyvinylidene fluoride), electrostrictive grafted elastomers, and liquid crystal elastomers; the ionic electroactive polymers include at least one of electrorheological fluids, ionic polymer-metal composites, and the like. This scheme makes full use of the physical characteristics of electroactive polymers to adjust the pixel density distribution of the image sensor.
A photo-deformable material is a polymer material that can change its shape or size under the action of a light field. A controllably deformable material portion prepared from a photo-deformable material is hereinafter referred to as a photo-deformable material portion. Using this physical characteristic of photo-deformable materials, the embodiments of the present application may, but are not limited to, determine, according to the target pixel density distribution information, the light field control information required to make the photo-deformable material portion undergo the corresponding deformation, and control, according to the light field control information, the light field acting on the photo-deformable material portion so that the photo-deformable material portion undergoes the corresponding deformation. The pixel density distribution of the image sensor is adjusted accordingly through the deformation of the photo-deformable material portion, thereby achieving the purpose of adjusting the pixel density distribution of the image sensor according to the target pixel density distribution information. The photo-deformable material may include, but is not limited to, at least one of: photostrictive ferroelectric ceramics and photo-deformable polymers; the photostrictive ferroelectric ceramics include, but are not limited to, lanthanum-doped lead zirconate titanate (PLZT) ceramics, and the photo-deformable polymers include, but are not limited to, photo-deformable liquid crystal elastomers. This scheme makes full use of the physical characteristics of photo-deformable materials to adjust the pixel density distribution of the image sensor.
A magnetostrictive material is a magnetic material whose magnetization state can be changed under the action of a magnetic field, thereby changing its size. A controllably deformable material portion prepared from a magnetostrictive material is hereinafter referred to as a magnetostrictive material portion. Using this physical characteristic of magnetostrictive materials, the embodiments of the present application may, but are not limited to, determine, according to the target pixel density distribution information, the magnetic field control information required to make the magnetostrictive material portion undergo the corresponding deformation, and control, according to the magnetic field control information, the magnetic field acting on the magnetostrictive material portion so that the magnetostrictive material portion undergoes the corresponding deformation. The pixel density distribution of the image sensor is adjusted accordingly through the deformation of the magnetostrictive material portion, thereby achieving the purpose of adjusting the pixel density distribution of the image sensor according to the target pixel density distribution information. The magnetostrictive material may include, but is not limited to, rare-earth giant magnetostrictive materials, such as the alloy Tb0.3Dy0.7Fe1.95 based on the (Tb,Dy)Fe2 compound. This scheme makes full use of the physical characteristics of magnetostrictive materials to adjust the pixel density distribution of the image sensor.
In the technical solutions provided by the embodiments of the present application, the specific structure and connection of the image sensor pixels and the controllably deformable material portion can be determined as actually needed, and the practical implementations are very flexible.
In one optional implementation, as shown in Fig. 1b, the controllably deformable material portion 12 includes a controllably deformable material layer 121, and the plurality of image sensor pixels 11 are distributed in an array and connected to one side of the controllably deformable material layer 121. Optionally, depending on the actual process conditions, the plurality of image sensor pixels may be formed directly on the controllably deformable material layer 121, or the plurality of image sensor pixels and the controllably deformable material layer 121 may be prepared separately and then closely connected by, but not limited to, bonding. This scheme is simple in structure and easy to implement.
In another optional implementation, as shown in Fig. 1c, the controllably deformable material portion 12 includes a plurality of controllably deformable material connecting sub-portions 122, which are distributed in an array and correspondingly connect the plurality of image sensor pixels 11 distributed in an array; that is, the plurality of image sensor pixels distributed in an array are connected into a whole by the plurality of controllably deformable material connecting sub-portions distributed in an array. Optionally, depending on the actual process, the plurality of controllably deformable material connecting sub-portions may be formed in the gap regions between the pixels of the image sensor pixel array, and the controllably deformable material connecting sub-portions may be connected with the corresponding image sensor pixels by, but not limited to, abutting or bonding. By controlling the deformation of the plurality of controllably deformable material connecting sub-portions, the density distribution of the image sensor pixels can be adjusted; the structure is simple and easy to implement.
Further, as shown in Figs. 1d and 1e, the image sensor may also include a deformation control portion 13, which is used to adjust the distribution of the external field acting on the controllably deformable material portion 12 so as to control the controllably deformable material portion 12 to undergo the corresponding deformation. In this way, when the controllably deformable material portion 12 deforms, the spacing between the image sensor pixels 11 is adjusted accordingly, thereby changing the density distribution of the image sensor pixels 11 and achieving the effect of giving different regions of the image sensor a differentiated pixel density distribution as actually needed.
Optionally, as shown in Fig. 1d, the deformation control portion may include a light field control unit 131, which is used to adjust the external light field distribution acting on the controllably deformable material portion 12 so as to control the controllably deformable material portion 12 to undergo the corresponding deformation. In this case, the controllably deformable material portion 12 may include a photo-deformable material portion prepared at least from a photo-deformable material; for example, the photo-deformable material portion may include a photo-deformable material layer prepared at least from the photo-deformable material, or the controllably deformable material portion may include a plurality of photo-deformable material connecting sub-portions prepared at least from the photo-deformable material. By changing the light field distribution acting on the photo-deformable material portion (in Fig. 1d, the arrow density represents light fields of different intensities acting on the controllably deformable material portion 12), the light field control unit 131 excites different regions of the controllably deformable material portion 12 to deform to different degrees, and the spacing between the image sensor pixels 11 is adjusted accordingly through the deformation of the controllably deformable material portion 12, thereby changing the density distribution of the image sensor pixels 11 and achieving the effect of giving different regions of the image sensor a differentiated pixel density distribution as actually needed.
Optionally, as shown in Fig. 1e, the deformation control portion may include an electric field control unit 132, which is used to adjust the external electric field distribution acting on the controllably deformable material portion so as to control the controllably deformable material portion to undergo the corresponding deformation. In this case, the controllably deformable material portion 12 may include a piezoelectric material portion prepared at least from a piezoelectric material (such as a piezoelectric material layer or piezoelectric material connecting sub-portions, etc.), or the controllably deformable material portion 12 may include an electroactive polymer portion prepared at least from an electroactive polymer (such as an electroactive polymer layer or electroactive polymer connecting sub-portions, etc.). As shown in Fig. 1e, the electric field control unit and the controllably deformable material may be connected by control lines, and the electric field control unit 132 excites different regions of the controllably deformable material portion 12 to deform to different degrees by changing the electric field distribution acting on the controllably deformable material portion. If the electric field acting on the controllably deformable material portion 12 is zero, the controllably deformable material portion does not deform (this may be called zero-field excitation); if the electric field intensity distribution acting on the controllably deformable material portion 12 is changed (the "+" positive and "-" negative field excitations shown in the figure) so that the electric field intensities acting on different regions of the controllably deformable material portion 12 differ, as shown in Fig. 1f, the different regions of the controllably deformable material portion deform to different degrees, and the spacing between the image sensor pixels 11 is adjusted accordingly through the deformation of the controllably deformable material portion 12, thereby changing the overall pixel density distribution of the image sensor and achieving the effect of giving different regions of the image sensor a differentiated pixel density distribution as actually needed.
The controllably deformable portion described in the embodiments of the present application may be directly or indirectly connected to the deformation control portion. The deformation control portion may be a part of the image sensor, or it may not be a part of the image sensor, in which case the image sensor may be connected to the deformation control portion through reserved pins, interfaces, or the like. The external field acting on the controllably deformable material portion may include, but is not limited to, an electric field, a magnetic field, a light field, and so on. The hardware and software structures for generating the electric field, the magnetic field, and the light field can be implemented using the corresponding prior art as actually needed, and are not described in detail in the embodiments of the present application.
Optionally, the image sensor may further include a flexible substrate, which may include, but is not limited to, a flexible plastic substrate with a certain flexibility whose shape can be changed as needed. The image sensor pixels and the controllably deformable material portion may be disposed on the same side or on different sides of the flexible substrate. For example, as shown in Fig. 1g, the plurality of image sensor pixels 11 are connected to one side of the flexible substrate 14, and the controllably deformable material portion (such as the controllably deformable material layer 121) is connected to the other side of the flexible substrate 14. As another example, as shown in Fig. 1h, the plurality of image sensor pixels 11 are connected to one side of the flexible substrate 14, and the controllably deformable material portion (such as the controllably deformable material connecting sub-portions 122) connects the corresponding image sensor pixels and is located on the same side of the flexible substrate 14 as the image sensor pixels 11. With this scheme, the overall pixel density distribution of the image sensor can be indirectly adjusted by controlling the controllably deformable material portion to deform through the external field acting on it, realizing an adjustable pixel density of the image sensor; moreover, because a flexible substrate is used, the shape of the image sensor can be changed flexibly, for example a planar image sensor can be bent by a certain angle to obtain a curved image sensor, thereby meeting application needs such as diverse image acquisition and decoration.
Fig. 1i is a schematic structural diagram of a seventh pixel-density-adjustable image sensor provided by an embodiment of the present application. In the image sensor shown in Fig. 1i, the controllably deformable material portion 12 includes a flexible substrate 123 and a plurality of magnetically permeable material portions 124; the plurality of image sensor pixels 11 are each connected to the flexible substrate 123, and the plurality of magnetically permeable material portions 124 are connected to at least some of the image sensor pixels 11. By changing the magnetic field acting on the magnetically permeable material portions 124, the flexible substrate 123 is made to undergo the corresponding deformation, and the density distribution of the plurality of image sensor pixels 11 is adjusted accordingly through that deformation. For example, a magnetically permeable material portion 124 may be disposed on one side of each image sensor pixel, and optionally, each image sensor pixel 11 is bonded to the flexible substrate 123 and to the magnetically permeable material portion 124. The magnetically permeable material portion may include a magnetic pole prepared from a magnetically permeable material, which may be, but is not limited to, one or more of soft magnetic materials, silicon steel sheet, permalloy, ferrite, amorphous soft magnetic alloys, super-microcrystalline soft magnetic alloys, and so on. A magnetically permeable material portion prepared from a soft magnetic material has good magnetic properties, and its remanence after the magnetic field is removed is very small, which facilitates the next adjustment.
Further, optionally, the shape control portion 13 described in the embodiments of the present application may further include a magnetic field control unit 133. The magnetic field control unit 133 is configured to adjust the distribution of the external magnetic field acting on the controllable deformation material portion, so as to control the controllable deformation material portion to deform correspondingly. For example, when the magnetic field (excitation field) that the magnetic field control unit 133 applies to the magnetically permeable material portions 124 changes, a repulsive field between like poles (N-N or S-S) or an attractive field between unlike poles (N-S or S-N) with a certain field strength distribution is applied between adjacent image sensor pixels, as shown in Fig. 1i. A repulsive or attractive force is correspondingly generated between the magnetic poles; this force is transmitted to the flexible substrate 123 and causes it to stretch or contract, so that the spacing between the corresponding image sensor pixels changes, thereby achieving the purpose of adjusting the pixel density distribution of the image sensor. This scheme combines the deformation characteristics of the flexible substrate, such as its stretchability, with magnetic field control to make the pixel density distribution of the image sensor adjustable.
Fig. 1j is a schematic structural diagram of an eighth pixel-density-adjustable image sensor provided by an embodiment of the present application. In the image sensor shown in Fig. 1j, the controllable deformation material portion 12 includes a flexible substrate 123 and multiple magnetically permeable material portions 124. One side of each of the multiple magnetically permeable material portions 124 is connected to the flexible substrate 123, and the opposite sides of the multiple magnetically permeable material portions 124 are respectively connected to the multiple image sensor pixels 11. By changing the magnetic field acting on the magnetically permeable material portions 124, the flexible substrate 123 is caused to deform correspondingly, and the density distribution of the multiple image sensor pixels 11 is adjusted accordingly through the deformation. Optionally, the magnetically permeable material portions 124 are bonded to the flexible substrate 123, and the image sensor pixels 11 are bonded to the magnetically permeable material portions 124. When the magnetic field acting on the magnetically permeable material portions 124 changes, the magnetic force is transmitted to the flexible substrate 123 and causes it to stretch or contract, thereby achieving the purpose of adjusting the pixel density distribution of the image sensor pixels. This scheme combines the deformation characteristics of the flexible substrate, such as its stretchability, with magnetic field control to make the pixel density distribution of the image sensor adjustable.
After the pixel density distribution of the image sensor has been adjusted according to the target pixel density distribution information, image acquisition of the scene to be photographed is carried out, and every image sensor pixel of the image sensor takes part in the acquisition. Since the pixel density distribution of the image sensor is adjusted according to the image target pixel density distribution information, and the image target pixel density distribution information is determined according to the target depth-of-field information of the scene to be photographed, the image of the scene to be photographed acquired by the adjusted image sensor exhibits a sharpness distribution across its different regions that corresponds to the target pixel density distribution information. The portions that the target depth-of-field information requires to be presented clearly are acquired with more pixels, so that those portions have higher sharpness and richer detail, which improves image acquisition efficiency; the portions that the target depth-of-field information does not require to be presented clearly are acquired with relatively fewer pixels and are therefore more blurred. In this way, the pixels of the image sensor are fully utilized as a whole to achieve a shallow depth-of-field image effect, which can better satisfy the diversified application demands of users.
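To make the mapping from target depth-of-field information to image target pixel density distribution information concrete, the following Python sketch illustrates one possible mapping under assumed inputs: a per-region depth map, a target depth-of-field interval [dof_near, dof_far], and illustrative parameters base_density and min_density. It is only an illustration of the idea described above, not the specific algorithm of the embodiments.

    import numpy as np

    def target_pixel_density_map(depth_map, focus_distance, dof_near, dof_far,
                                 base_density=1.0, min_density=0.25):
        """Sketch: derive a relative target pixel density map from a per-region
        depth map and a target depth-of-field interval [dof_near, dof_far].

        Regions whose object distance falls inside the target depth of field keep
        the full base density; regions outside it get a density that falls off
        with their distance from the nearest in-focus boundary, so out-of-focus
        parts of the scene are sampled with fewer pixels."""
        density = np.full(depth_map.shape, base_density, dtype=float)
        in_focus = (depth_map >= dof_near) & (depth_map <= dof_far)
        # Distance of each out-of-focus region from the depth-of-field boundaries.
        dist = np.where(depth_map < dof_near, dof_near - depth_map,
                        np.where(depth_map > dof_far, depth_map - dof_far, 0.0))
        # Simple monotonic falloff: farther from the in-focus range, lower density.
        falloff = base_density / (1.0 + dist / max(focus_distance, 1e-6))
        return np.where(in_focus, density, np.maximum(falloff, min_density))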
After image acquisition is performed with the image sensor whose pixel density distribution has been adjusted, the acquired image may be output by scanning. For example, the pixel index information of the image sensor may be obtained, and the acquired image may be scanned and output according to the pixel index information. The pixel index information of the image sensor includes the original position information of each image sensor pixel before the pixel density distribution adjustment. Image scanning output is performed according to the pixel index information with a certain scan mode (progressive scan, column-by-column scan, interlaced scan, and so on). Since there is a certain deviation between the actual position of each pixel of the image sensor during image acquisition and the index information of that pixel, the image output by scanning according to the pixel index information is, relative to the originally acquired image, a deformed image with an abnormal display scale. In the deformed image, a region with a higher pixel density appears larger than the corresponding region in the originally acquired image; for example, if the region of higher pixel density of the image sensor covers a head portrait portion of the scene, the head portrait portion in the deformed image obtained by scanning output is larger than the head portrait portion in the originally acquired image. In this way, the user can see the portion of interest more conveniently, the efficiency of image display is improved, and the visual experience of the user is improved.
If a non-deformed image whose display scale is consistent with that of the preview image is desired, the acquired image may be scanned and output according to the pixel position information of the adjusted image sensor, so as to obtain a recovered image with a normal display scale corresponding to the preview image.
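As a hedged illustration of the two output modes described above (scanning by the pre-adjustment pixel index versus scanning by the actual post-adjustment pixel positions), the following Python sketch either writes each sample to its original index, which yields the deformed image with enlarged high-density regions, or resamples the irregularly positioned samples onto a uniform grid, which yields the recovered image with a normal display scale. The helper names and the use of scipy.interpolate.griddata are assumptions made for this sketch only.

    import numpy as np
    from scipy.interpolate import griddata

    def scan_by_index(samples, index_rows, index_cols, out_shape):
        """Scanning output by pixel index: each sample goes to the grid cell given
        by its original (pre-adjustment) row/column index, producing the deformed
        image whose denser regions appear enlarged."""
        image = np.zeros(out_shape, dtype=float)
        image[index_rows, index_cols] = samples
        return image

    def scan_by_position(samples, positions, out_shape):
        """Scanning output by actual pixel positions after adjustment: the
        irregularly spaced samples are resampled onto a uniform grid, giving a
        recovered image with a normal display scale matching the preview image."""
        gy, gx = np.mgrid[0:out_shape[0], 0:out_shape[1]]
        return griddata(positions, samples, (gy, gx), method='linear', fill_value=0.0)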
Those skilled in the art will understand that, in any of the above methods of the embodiments of the present application, the numbering of the steps does not imply an order of execution; the execution order of the steps should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
Fig. 6 is a logic block diagram of a first imaging control apparatus provided by an embodiment of the present application. As shown in Fig. 6, an imaging control apparatus provided by an embodiment of the present application includes: a target depth-of-field information determining module 61, a target pixel density distribution information determining module 62, a pixel density adjustment module 63, and an image acquisition module 64.
The target depth-of-field information determining module 61 is configured to determine the target depth-of-field information of a scene to be photographed.
The target pixel density distribution information determining module 62 is configured to determine image target pixel density distribution information according to the target depth-of-field information.
The pixel density adjustment module 63 is configured to adjust the pixel density distribution of an image sensor according to the image target pixel density distribution information.
The image acquisition module 64 is configured to capture an image of the scene to be photographed with the adjusted image sensor.
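The four modules above form a simple pipeline. The following Python sketch shows one possible decomposition of that pipeline; the class, method, and attribute names are hypothetical, and the sensor and depth-estimator objects are assumed interfaces, so this is an illustration rather than the apparatus itself.

    class ImagingController:
        """Illustrative pipeline mirroring modules 61-64 of Fig. 6."""

        def __init__(self, sensor, depth_estimator, density_fn):
            self.sensor = sensor                  # pixel-density-adjustable image sensor
            self.depth_estimator = depth_estimator
            self.density_fn = density_fn          # maps (depth map, target DoF) to a density map

        def determine_target_dof(self, scene):
            # Module 61: target depth-of-field information, e.g. from depth-of-field
            # determination input or from object-point depth information.
            return self.depth_estimator.target_depth_of_field(scene)

        def determine_density_distribution(self, scene, target_dof):
            # Module 62: image target pixel density distribution information.
            return self.density_fn(self.depth_estimator.depth_map(scene), target_dof)

        def capture(self, scene):
            target_dof = self.determine_target_dof(scene)
            density = self.determine_density_distribution(scene, target_dof)
            self.sensor.adjust_pixel_density(density)    # Module 63: adjust the sensor
            return self.sensor.acquire()                 # Module 64: capture the image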
In the embodiments of the present application, every pixel of the image sensor takes part in image acquisition. Since the pixel density distribution of the image sensor is adjusted according to the image target pixel density distribution information, and the image target pixel density distribution information is determined according to the target depth-of-field information of the scene to be photographed, the image of the scene to be photographed acquired by the adjusted image sensor exhibits a sharpness distribution across its different regions that corresponds to the target pixel density distribution information: the portions that the target depth-of-field information requires to be presented clearly are acquired with more pixels, giving higher sharpness and improving acquisition efficiency, while the portions that need not be presented clearly are acquired with relatively fewer pixels and appear more blurred. In this way, the pixels of the image sensor are fully utilized as a whole to achieve a shallow depth-of-field image effect, which can better satisfy the diversified application demands of users.
The form of the imaging control apparatus is not limited. For example, the imaging control apparatus may be an independent component that communicates cooperatively with an imaging device including the image sensor; alternatively, the imaging control apparatus may be integrated, as a functional module, into an imaging device including the image sensor. The embodiments of the present application do not limit this.
Optionally, as shown in Fig. 7, the target depth-of-field information determining module 61 includes: a depth-of-field determination information acquisition sub-module 611 and a first target depth-of-field information determination sub-module 612. The depth-of-field determination information acquisition sub-module 611 is configured to obtain depth-of-field determination information; the first target depth-of-field information determination sub-module 612 is configured to determine the target depth-of-field information of the scene to be photographed according to the depth-of-field determination information. This scheme makes the determination of the target depth-of-field information very flexible and better satisfies varied practical application requirements.
Optionally, the target depth-of-field information determining module 61 includes: an object-point depth information acquisition sub-module 613 and a second target depth-of-field information sub-module 614. The object-point depth information acquisition sub-module 613 is configured to obtain object-point depth information of the scene to be photographed; the second target depth-of-field information sub-module 614 is configured to determine the target depth-of-field information of the scene to be photographed according to the object-point depth information. This scheme makes the determination of the target depth-of-field information more convenient and accurate.
Optionally, as shown in Fig. 8, the target depth-of-field information determining module 61 includes a target depth-of-field determination sub-module 615, and the target pixel density distribution information determining module 62 includes a first target pixel density distribution information determination sub-module 621. The target depth-of-field determination sub-module 615 is configured to determine the target depth of field of the scene to be photographed relative to a focal plane; the first target pixel density distribution information determination sub-module 621 is configured to determine the image target pixel density distribution information according to the target depth of field. With this scheme, an image with a shallow depth-of-field effect that is visually sharp in focus and blurred out of focus can be obtained.
Optionally, the target depth-of-field information determining module 61 includes a distance information determination sub-module 616, and the target pixel density distribution information determining module 62 includes a second target pixel density distribution information determination sub-module 622. The distance information determination sub-module 616 is configured to determine distance information from at least some out-of-focus object points of the scene to be photographed to the focal plane; the second target pixel density distribution information determination sub-module 622 is configured to determine the image target pixel density distribution information according to the distance information. With this scheme, a shallow depth-of-field image effect can be obtained in which object points closer to the focal plane are imaged more sharply and object points farther from it are imaged with more blur.
Optionally, the imaging control apparatus further includes a focal plane determining module 65. The focal plane determining module 65 is configured to determine the focal plane of the scene to be photographed. With this scheme, the focal plane of the scene to be photographed can be determined according to actual needs, and the implementation is very flexible.
Further, optionally, the focal plane determining module 65 includes: a region-of-interest determination information acquisition sub-module 651 and a first focal plane determination sub-module 652. The region-of-interest determination information acquisition sub-module 651 is configured to obtain region-of-interest determination information; the first focal plane determination sub-module 652 is configured to determine the focal plane according to the region-of-interest determination information. In this scheme, the focal plane of the scene to be photographed is determined according to the region of interest, so that the determined focal plane better matches the actual needs of the user, and personalized application demands can be better satisfied.
Optionally, the focal plane determining module 65 includes: an image analysis sub-module 653 and a second focal plane determination sub-module 654. The image analysis sub-module 653 is configured to analyze a preview image of the scene to be photographed formed on the image sensor; the second focal plane determination sub-module 654 is configured to determine the focal plane according to the image analysis result. With this scheme, the focal plane of the scene to be photographed can be determined according to the image analysis result of the preview image, which makes the determination of the focal plane more intelligent and improves its efficiency and universality.
Optionally, as shown in Fig. 9, the target depth-of-field information determining module 61 includes a target out-of-focus blur degree information determination sub-module 617, and the target pixel density distribution information determining module 62 includes a third target pixel density distribution information determination sub-module 623. The target out-of-focus blur degree information determination sub-module 617 is configured to determine target out-of-focus blur degree information of the scene to be photographed; the third target pixel density distribution information determination sub-module 623 is configured to determine the image target pixel density distribution information according to the target out-of-focus blur degree information. This scheme can refine the out-of-focus blur degree, so that the visual effect of the image is more vivid and diversified practical application requirements are satisfied.
Optionally, the target out-of-focus blur degree information determination sub-module 617 includes a circle-of-confusion distribution information determination unit 6171, and the third target pixel density distribution information determination sub-module 623 includes a target pixel density distribution information determination unit 6231. The circle-of-confusion distribution information determination unit 6171 is configured to determine circle-of-confusion distribution information of at least some imaging points, on the image sensor, of at least some out-of-focus object points of the scene to be photographed; the target pixel density distribution information determination unit 6231 is configured to determine the image target pixel density distribution information according to the circle-of-confusion distribution information of the at least some imaging points. Further, the circle-of-confusion distribution information determination unit 6171 may include: a first circle-of-confusion information determination subunit 62311, a second circle-of-confusion information determination subunit 62312, and a circle-of-confusion distribution information determination subunit 62313. The first circle-of-confusion information determination subunit 62311 is configured to determine circle-of-confusion information of at least one imaging point, on the image sensor, of at least one out-of-focus object point of the scene to be photographed; the second circle-of-confusion information determination subunit 62312 is configured to determine circle-of-confusion information of at least some other imaging points according to the distances between at least some other out-of-focus object points of the scene to be photographed and the focal plane and the determined circle-of-confusion information of the at least one imaging point; the circle-of-confusion distribution information determination subunit 62313 is configured to determine the circle-of-confusion distribution information of the at least some imaging points according to each piece of determined circle-of-confusion information. Determining the out-of-focus blur degree on the basis of circle-of-confusion distribution information makes the setting of the out-of-focus blur degree more flexible.
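For background on the circle-of-confusion based scheme, the following Python sketch computes the circle-of-confusion diameter of an imaging point with the standard thin-lens relation and maps it to a relative target pixel density that shrinks as the blur circle grows. The lens parameters (focal_length, f_number, focus_distance) and the mapping in density_from_coc are illustrative assumptions, not values or rules taken from the embodiments.

    def circle_of_confusion(object_distance, focus_distance, focal_length, f_number):
        """Thin-lens circle-of-confusion diameter of the imaging point of an
        out-of-focus object point; all distances in the same unit (e.g. mm)."""
        aperture = focal_length / f_number
        return (aperture * focal_length * abs(object_distance - focus_distance)
                / (object_distance * (focus_distance - focal_length)))

    def density_from_coc(coc, acceptable_coc=0.03, base_density=1.0, min_density=0.2):
        """Map a circle-of-confusion diameter to a relative target pixel density:
        blur within the acceptable circle keeps full density, larger blur circles
        get proportionally fewer pixels."""
        if coc <= acceptable_coc:
            return base_density
        return max(base_density * acceptable_coc / coc, min_density)

    # Example: 50 mm lens at f/2 focused at 2 m; object points at 1.2 m and 5 m.
    for d in (1200.0, 5000.0):
        c = circle_of_confusion(d, 2000.0, 50.0, 2.0)
        print(d, round(c, 3), round(density_from_coc(c), 3))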
Optionally, as shown in Fig. 10, the pixel density adjustment module 63 includes: a shape control information determination sub-module 631 and a shape control module 632. The shape control information determination sub-module 631 is configured to determine shape control information of the controllable deformation material portion according to the image target pixel density distribution information; the shape control module 632 is configured to control the controllable deformation material portion to deform according to the shape control information, so that the pixel density distribution of the image sensor is adjusted accordingly through the deformation of the controllable deformation material portion.
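One way to picture the shape control information is as a field of per-region stretch or contract commands derived from the target density map. The sketch below is an assumption-laden illustration rather than the control law of the embodiments: it simply uses the square-root relation between a linear scale factor and an areal pixel density, assuming the deformable material contracts a region to raise its local density and stretches it to lower it.

    import numpy as np

    def shape_control_info(target_density, baseline_density=1.0):
        """Sketch: turn a target pixel density map into per-region linear scale
        factors. scale < 1 means contract the region (pixels packed more densely);
        scale > 1 means stretch the region (pixels spread more sparsely)."""
        target = np.asarray(target_density, dtype=float)
        return np.sqrt(baseline_density / np.clip(target, 1e-6, None))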
Optionally, the imaging control apparatus further includes: a pixel index information acquisition module 65 and a first image output module 66. The pixel index information acquisition module 65 is configured to obtain pixel index information of the image sensor; the first image output module 66 is configured to scan and output the acquired image according to the pixel index information. The image output by scanning according to the pixel index information is, relative to the originally acquired image, a deformed image with an abnormal display scale, which makes it convenient for the user to see the portion of interest, improves the efficiency of image display, and improves the visual experience of the user.
Optionally, the imaging control apparatus further includes a second image output module 67. The second image output module 67 is configured to scan and output the acquired image according to the pixel position information of the adjusted image sensor. With this scheme, an image with a normal display scale, in which different regions differ in sharpness, can be obtained.
Fig. 11 is a logic block diagram of a further imaging control apparatus provided by an embodiment of the present application; the specific embodiment of the present application does not limit the specific implementation of the imaging control apparatus 1100. As shown in Fig. 11, the imaging control apparatus 1100 may include:
a processor (Processor) 1110, a communication interface (Communications Interface) 1120, a memory (Memory) 1130, and a communication bus 1140, wherein:
the processor 1110, the communication interface 1120, and the memory 1130 communicate with one another through the communication bus 1140;
the communication interface 1120 is used for communicating with devices having communication functions, external light sources, and the like.
The processor 1110 is used for executing a program 1132, and may specifically perform the relevant steps in any of the above imaging control method embodiments.
For example, the program 1132 may include program code, and the program code includes computer operation instructions.
The processor 1110 may be a central processing unit (Central Processing Unit, CPU for short), or an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC for short), or may be configured as one or more integrated circuits implementing the embodiments of the present application.
The memory 1130 is used for storing the program 1132. The memory 1130 may include a random access memory (Random Access Memory, RAM for short), and may also include a non-volatile memory, for example at least one disk memory.
For example, in an optional implementation, the processor 1110 may perform the following steps by executing the program 1132: determining target depth-of-field information of a scene to be photographed; determining image target pixel density distribution information according to the target depth-of-field information; adjusting the pixel density distribution of an image sensor according to the image target pixel density distribution information; and capturing an image of the scene to be photographed with the adjusted image sensor.
In other optional implementations, the processor 1110 may also perform, by executing the program 1132, the steps mentioned in any of the other embodiments above, which are not repeated here.
For the specific implementation of each step in the program 1132, reference may be made to the corresponding descriptions of the corresponding steps, modules, sub-modules, and units in the above embodiments, which are not repeated here. Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the devices and modules described above may refer to the corresponding process descriptions in the foregoing method embodiments, and are not repeated here.
Fig. 12 is a logic block diagram of an imaging device provided by an embodiment of the present application. As shown in Fig. 12, an imaging device 120 provided by an embodiment of the present application includes an image sensor 121 and an imaging control apparatus 122, and the imaging control apparatus 122 is connected to the image sensor 121. For the structure and operating principle of the imaging control apparatus, reference may be made to the descriptions of the corresponding embodiments above, which are not repeated here. The image sensor includes a flexible substrate and multiple image sensor pixels formed on the flexible substrate. The imaging device may include, but is not limited to, a device with an image acquisition function such as photographing, imaging, or video monitoring, and may be, for example but not limited to, a camera, a mobile phone, a still camera, a video camera, a recorder, and the like.
With the technical solution provided by the embodiments of the present application, since the pixel density distribution of the image sensor is adjusted according to the image target pixel density distribution information, and the image target pixel density distribution information is determined according to the target depth-of-field information of the scene to be photographed, the image of the scene to be photographed acquired by the adjusted image sensor exhibits a sharpness distribution across its different regions that corresponds to the target pixel density distribution information: the portions that the target depth-of-field information requires to be presented clearly are acquired with more pixels, giving higher sharpness and improving acquisition efficiency, while the portions that need not be presented clearly are acquired with relatively fewer pixels and appear more blurred. In this way, the pixels of the image sensor are fully utilized as a whole to achieve a shallow depth-of-field image effect, which can better satisfy the diversified application demands of users.
Optionally, the image sensor includes: multiple image sensor pixels distributed in an array; and a controllable deformation material portion connected respectively to the multiple image sensor pixels. The controllable deformation material portion can deform under the action of an external field, and the density distribution of the multiple image sensor pixels is adjusted accordingly through the deformation; the external field is controlled by the imaging control apparatus.
For the structure of the image sensor, reference may be made to the corresponding descriptions of Figs. 1b-1j. The imaging control apparatus may directly control the external field so as to control the deformation of the controllable deformation material portion and thereby adjust the pixel density distribution of the image sensor; alternatively, the imaging control apparatus may indirectly control the external field by controlling the shape control portion, so that the controllable deformation material portion deforms correspondingly and the pixel density distribution of the image sensor is adjusted accordingly; and so on. The physical connection between the image sensor pixels and the deformable material portion may be determined according to actual needs, as long as the pixel density distribution of the image sensor can be adjusted when the deformable material portion deforms; the embodiments of the present application do not limit this, and for specific implementations reference may be made to the corresponding descriptions above, which are not repeated here.
In the above embodiments of the present application, the sequence numbers and/or order of the embodiments are merely for convenience of description and do not represent the superiority or inferiority of the embodiments. The description of each embodiment has its own emphasis; for parts not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments. For related descriptions of the implementation principles or processes of the apparatus, device, or system embodiments, reference may be made to the records of the corresponding method embodiments, which are not repeated here.
Those of ordinary skill in the art may realize that the units and method steps of the examples described in conjunction with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed by hardware or software depends on the specific application and design constraints of the technical solution. A skilled person may use different methods to implement the described functions for each specific application, but such implementation should not be considered to go beyond the scope of the present application.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media that can store program code, such as a USB flash disk, a removable hard disk, a read-only memory (Read-Only Memory, ROM for short), a random access memory (Random Access Memory, RAM for short), a magnetic disk, or an optical disc.
In the embodiments of the apparatus, method, and system of the present application, each component (system, subsystem, module, sub-module, unit, subunit, etc.) or each step can obviously be decomposed, combined, and/or recombined after decomposition. Such decompositions and/or recombinations should be regarded as equivalent solutions of the present application. Meanwhile, in the above description of the specific embodiments of the present application, a feature described and/or shown for one implementation may be used in the same or a similar way in one or more other implementations, combined with features of other implementations, or substituted for features of other implementations.
It should be emphasized that the term "comprises/comprising", when used herein, refers to the presence of a feature, element, step, or component, but does not exclude the presence or addition of one or more other features, elements, steps, or components.
Finally, it should be noted that the above embodiments are merely intended to illustrate the present application and are not a limitation of the present application. Those of ordinary skill in the related technical field can make various changes and modifications without departing from the spirit and scope of the present application; therefore, all equivalent technical solutions fall within the scope of the present application, and the patent protection scope of the present application should be defined by the claims.

Claims (37)

  1. An imaging control method for a shallow depth-of-field effect image, characterized by comprising:
    determining target depth-of-field information of a scene to be photographed;
    determining image target pixel density distribution information according to the target depth-of-field information;
    adjusting a pixel density distribution of an image sensor according to the image target pixel density distribution information;
    capturing an image of the scene to be photographed with the adjusted image sensor.
  2. The imaging control method according to claim 1, characterized in that determining the target depth-of-field information of the scene to be photographed comprises:
    obtaining depth-of-field determination information;
    determining the target depth-of-field information of the scene to be photographed according to the depth-of-field determination information.
  3. The imaging control method according to claim 1, characterized in that determining the target depth-of-field information of the scene to be photographed comprises:
    obtaining object-point depth information of the scene to be photographed;
    determining the target depth-of-field information of the scene to be photographed according to the object-point depth information.
  4. The imaging control method according to any one of claims 1-3, characterized in that
    the target depth-of-field information comprises: a target depth of field of the scene to be photographed relative to a focal plane;
    determining the image target pixel density distribution information comprises: determining the image target pixel density distribution information according to the target depth of field.
  5. The imaging control method according to any one of claims 1-3, characterized in that
    the target depth-of-field information comprises: distance information from at least some out-of-focus object points of the scene to be photographed to a focal plane;
    determining the image target pixel density distribution information comprises: determining the image target pixel density distribution information according to the distance information.
  6. The imaging control method according to claim 4, characterized by further comprising:
    determining the focal plane of the scene to be photographed.
  7. The imaging control method according to claim 5, characterized by further comprising:
    determining the focal plane of the scene to be photographed.
  8. The imaging control method according to claim 6, characterized in that determining the focal plane of the scene to be photographed comprises:
    obtaining region-of-interest determination information;
    determining the focal plane according to the region-of-interest determination information.
  9. The imaging control method according to claim 7, characterized in that determining the focal plane of the scene to be photographed comprises:
    obtaining region-of-interest determination information;
    determining the focal plane according to the region-of-interest determination information.
  10. The imaging control method according to claim 6, characterized in that determining the focal plane of the scene to be photographed comprises:
    analyzing a preview image of the scene to be photographed on the image sensor;
    determining the focal plane according to an image analysis result.
  11. The imaging control method according to claim 7, characterized in that determining the focal plane of the scene to be photographed comprises:
    analyzing a preview image of the scene to be photographed on the image sensor;
    determining the focal plane according to an image analysis result.
  12. The imaging control method according to any one of claims 1-3, characterized in that
    the target depth-of-field information comprises: target out-of-focus blur degree information;
    determining the image target pixel density distribution information comprises: determining the image target pixel density distribution information according to the target out-of-focus blur degree information.
  13. The imaging control method according to claim 12, characterized in that
    the target out-of-focus blur degree information comprises: circle-of-confusion distribution information of at least some imaging points, on the image sensor, of at least some out-of-focus object points of the scene to be photographed;
    determining the image target pixel density distribution information comprises: determining the image target pixel density distribution information according to the circle-of-confusion distribution information of the at least some imaging points.
  14. The imaging control method according to claim 13, characterized in that determining the circle-of-confusion distribution information comprises:
    determining circle-of-confusion information of at least one imaging point, on the image sensor, of at least one out-of-focus object point of the scene to be photographed;
    determining circle-of-confusion information of at least some other imaging points according to distances between at least some other out-of-focus object points of the scene to be photographed and the focal plane and the determined circle-of-confusion information of the at least one imaging point;
    determining the circle-of-confusion distribution information of the at least some imaging points according to each piece of determined circle-of-confusion information.
  15. The imaging control method according to any one of claims 1-3, characterized in that adjusting the pixel density distribution of the image sensor according to the image target pixel density distribution information comprises:
    determining shape control information of a controllable deformation material portion according to the image target pixel density distribution information;
    controlling the controllable deformation material portion to deform according to the shape control information, so as to adjust the pixel density distribution of the image sensor accordingly through the deformation of the controllable deformation material portion.
  16. The imaging control method according to claim 15, characterized in that the controllable deformation material portion is prepared from at least one or more of the following controllable deformation materials: a piezoelectric material, an electroactive polymer, a photo-deformable material, and a magnetostrictive material.
  17. The imaging control method according to any one of claims 1-3, characterized by further comprising:
    obtaining pixel index information of the image sensor;
    scanning and outputting the acquired image according to the pixel index information.
  18. The imaging control method according to any one of claims 1-3, characterized by further comprising:
    scanning and outputting the acquired image according to pixel position information of the adjusted image sensor.
  19. An imaging control apparatus, characterized by comprising:
    a target depth-of-field information determining module, configured to determine target depth-of-field information of a scene to be photographed;
    a target pixel density distribution information determining module, configured to determine image target pixel density distribution information according to the target depth-of-field information;
    a pixel density adjustment module, configured to adjust a pixel density distribution of an image sensor according to the image target pixel density distribution information;
    an image acquisition module, configured to capture an image of the scene to be photographed with the adjusted image sensor.
  20. The imaging control apparatus according to claim 19, characterized in that the target depth-of-field information determining module comprises:
    a depth-of-field determination information acquisition sub-module, configured to obtain depth-of-field determination information;
    a first target depth-of-field information determination sub-module, configured to determine the target depth-of-field information of the scene to be photographed according to the depth-of-field determination information.
  21. The imaging control apparatus according to claim 19, characterized in that the target depth-of-field information determining module comprises:
    an object-point depth information acquisition sub-module, configured to obtain object-point depth information of the scene to be photographed;
    a second target depth-of-field information sub-module, configured to determine the target depth-of-field information of the scene to be photographed according to the object-point depth information.
  22. The imaging control apparatus according to any one of claims 19-21, characterized in that
    the target depth-of-field information determining module comprises: a target depth-of-field determination sub-module, configured to determine a target depth of field of the scene to be photographed relative to a focal plane;
    the image target pixel density distribution information determining module comprises: a first target pixel density distribution information determination sub-module, configured to determine the image target pixel density distribution information according to the target depth of field.
  23. The imaging control apparatus according to any one of claims 19-21, characterized in that
    the target depth-of-field information determining module comprises: a distance information determination sub-module, configured to determine distance information from at least some out-of-focus object points of the scene to be photographed to a focal plane;
    the image target pixel density distribution information determining module comprises: a second target pixel density distribution information determination sub-module, configured to determine the image target pixel density distribution information according to the distance information.
  24. The imaging control apparatus according to claim 22, characterized by further comprising:
    a focal plane determining module, configured to determine the focal plane of the scene to be photographed.
  25. The imaging control apparatus according to claim 23, characterized by further comprising:
    a focal plane determining module, configured to determine the focal plane of the scene to be photographed.
  26. The imaging control apparatus according to claim 24, characterized in that the focal plane determining module comprises:
    a region-of-interest determination information acquisition sub-module, configured to obtain region-of-interest determination information;
    a first focal plane determination sub-module, configured to determine the focal plane according to the region-of-interest determination information.
  27. The imaging control apparatus according to claim 25, characterized in that the focal plane determining module comprises:
    a region-of-interest determination information acquisition sub-module, configured to obtain region-of-interest determination information;
    a first focal plane determination sub-module, configured to determine the focal plane according to the region-of-interest determination information.
  28. The imaging control apparatus according to claim 24, characterized in that the focal plane determining module comprises:
    an image analysis sub-module, configured to analyze a preview image of the scene to be photographed on the image sensor;
    a second focal plane determination sub-module, configured to determine the focal plane according to an image analysis result.
  29. The imaging control apparatus according to claim 25, characterized in that the focal plane determining module comprises:
    an image analysis sub-module, configured to analyze a preview image of the scene to be photographed on the image sensor;
    a second focal plane determination sub-module, configured to determine the focal plane according to an image analysis result.
  30. The imaging control apparatus according to any one of claims 19-21, characterized in that
    the target depth-of-field information determining module comprises: a target out-of-focus blur degree information determination sub-module, configured to determine target out-of-focus blur degree information of the scene to be photographed;
    the image target pixel density distribution information determining module comprises: a third target pixel density distribution information determination sub-module, configured to determine the image target pixel density distribution information according to the target out-of-focus blur degree information.
  31. The imaging control apparatus according to claim 30, characterized in that
    the target out-of-focus blur degree information determination sub-module comprises: a circle-of-confusion distribution information determination unit, configured to determine circle-of-confusion distribution information of at least some imaging points, on the image sensor, of at least some out-of-focus object points of the scene to be photographed;
    the third target pixel density distribution information determination sub-module comprises: a target pixel density distribution information determination unit, configured to determine the image target pixel density distribution information according to the circle-of-confusion distribution information of the at least some imaging points.
  32. The imaging control apparatus according to claim 31, characterized in that the circle-of-confusion distribution information determination unit comprises:
    a first circle-of-confusion information determination subunit, configured to determine circle-of-confusion information of at least one imaging point, on the image sensor, of at least one out-of-focus object point of the scene to be photographed;
    a second circle-of-confusion information determination subunit, configured to determine circle-of-confusion information of at least some other imaging points according to distances between at least some other out-of-focus object points of the scene to be photographed and the focal plane and the determined circle-of-confusion information of the at least one imaging point;
    a circle-of-confusion distribution information determination subunit, configured to determine the circle-of-confusion distribution information of the at least some imaging points according to each piece of determined circle-of-confusion information.
  33. The imaging control apparatus according to any one of claims 19-21, characterized in that the pixel density adjustment module comprises:
    a shape control information determination sub-module, configured to determine shape control information of a controllable deformation material portion according to the image target pixel density distribution information;
    a shape control module, configured to control the controllable deformation material portion to deform according to the shape control information, so as to adjust the pixel density distribution of the image sensor accordingly through the deformation of the controllable deformation material portion.
  34. The imaging control apparatus according to any one of claims 19-21, characterized by further comprising:
    a pixel index information acquisition module, configured to obtain pixel index information of the image sensor;
    a first image output module, configured to scan and output the acquired image according to the pixel index information.
  35. The imaging control apparatus according to any one of claims 19-21, characterized by further comprising:
    a second image output module, configured to scan and output the acquired image according to pixel position information of the adjusted image sensor.
  36. An imaging device, characterized by comprising an image sensor and the imaging control apparatus according to any one of claims 19-35, wherein the imaging control apparatus is connected to the image sensor.
  37. The imaging device according to claim 36, characterized in that the image sensor comprises:
    multiple image sensor pixels distributed in an array; and
    a controllable deformation material portion connected respectively to the multiple image sensor pixels, wherein the controllable deformation material portion can deform under the action of an external field, and the density distribution of the multiple image sensor pixels is adjusted accordingly through the deformation; the external field is controlled by the imaging control apparatus.
CN201410426154.7A 2014-08-26 2014-08-26 The image formation control method and device and imaging device of shallow Deep Canvas image Active CN104159038B (en)

Priority Applications (1)

CN201410426154.7A (CN104159038B), priority date 2014-08-26, filing date 2014-08-26: The image formation control method and device and imaging device of shallow Deep Canvas image

Applications Claiming Priority (1)

CN201410426154.7A (CN104159038B), priority date 2014-08-26, filing date 2014-08-26: The image formation control method and device and imaging device of shallow Deep Canvas image

Publications (2)

CN104159038A (en), published 2014-11-19
CN104159038B (en), published 2018-05-08

Family

ID=51884436

Family Applications (1)

CN201410426154.7A (CN104159038B, Active), priority date 2014-08-26, filing date 2014-08-26: The image formation control method and device and imaging device of shallow Deep Canvas image

Country Status (1)

CN (1) CN104159038B (en)

Also Published As

CN104159038A (en), published 2014-11-19


Legal Events

C06 / PB01: Publication
C10 / SE01: Entry into substantive examination
GR01: Patent grant