CN104092955B - Flash control method and control device, image-pickup method and collecting device - Google Patents
Flash control method and control device, image-pickup method and collecting device
- Publication number
- CN104092955B (application CN201410373832.8A)
- Authority
- CN
- China
- Prior art keywords
- flash
- depth
- light
- focal length
- projection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
The embodiments of the present application disclose a flash control method and device, together with a corresponding image-pickup method and equipment. The flash control method includes: obtaining depth distribution information of at least one photographed object in a scene to be captured relative to a shooting reference position; determining multiple projection focal lengths according to the depth distribution information; and determining, according to at least the depth distribution information, multiple flash patterns in one-to-one correspondence with the multiple projection focal lengths. Based on the depth distribution information of the photographed objects in the scene to be captured, the embodiments determine the projection focal lengths and flash patterns used to flash the photographed objects of different depth ranges, so that when the scene is shot, those projection focal lengths and flash patterns provide suitable fill light to objects in different depth ranges and a well-exposed image of the scene can be collected.
Description
Technical field
This application relates to the field of image acquisition technology, and in particular to a flash control method and control device, and an image-pickup method and collecting device.
Background technology
When the ambient light is poor, particularly at night, taking a photo requires a flash lamp to provide fill light for the scene: the light emitted by the flash lamp illuminates the scene during the shot, yielding a better photographic effect.
Some flash lamps are mounted directly on the camera; mobile phones and consumer cameras, for example, usually have a built-in flash module. Some more professional cameras use external flash units (speedlights) to provide better fill light for the scene.
Summary of the invention
One possible purpose of the present application is to provide a flash control technical solution and a corresponding image acquisition technical solution.
In a first aspect, a possible embodiment of the present application provides a flash control method, including:
obtaining depth distribution information of at least one photographed object in a scene to be captured relative to a shooting reference position;
determining multiple projection focal lengths according to the depth distribution information;
determining, according to at least the depth distribution information, multiple flash patterns in one-to-one correspondence with the multiple projection focal lengths.
In a second aspect, a possible embodiment of the present application provides a flash control device, including:
an information acquisition submodule, configured to obtain depth distribution information of at least one photographed object in a scene to be captured relative to a shooting reference position;
a projection focal length determination submodule, configured to determine multiple projection focal lengths according to the depth distribution information; and
a flash pattern determination submodule, configured to determine, according to at least the depth distribution information, multiple flash patterns in one-to-one correspondence with the multiple projection focal lengths.
In a third aspect, a possible embodiment of the present application provides an image-pickup method, including:
obtaining multiple projection focal lengths for a scene to be captured and the multiple flash patterns corresponding to those projection focal lengths;
in response to a shooting instruction, performing multiple projection flashes on the scene to be captured, and shooting the scene multiple times to obtain multiple initial images, wherein each projection flash corresponds to at least one of the multiple projection focal lengths and to at least one flash pattern corresponding to that projection focal length, and each shot corresponds to one of the projection flashes;
synthesizing the multiple initial images.
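The synthesis step can be illustrated with a minimal sketch. The patent does not specify the synthesis algorithm; the per-pixel maximum merge below is only one plausible way to combine the initial images so that each region keeps the exposure from the flash that lit it best, and the function name and grayscale-list representation are hypothetical, not part of the claimed subject matter.

```python
def synthesize(initial_images):
    """Merge multiple initial images (2-D lists of grayscale values)
    by taking the per-pixel maximum. This is one plausible merge rule;
    the patent does not fix the synthesis algorithm."""
    return [[max(pixels) for pixels in zip(*rows)]
            for rows in zip(*initial_images)]
```

For example, merging two 2x2 images where each image correctly exposes a different region yields an image that keeps the brighter (better-exposed) value at every pixel.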
In a fourth aspect, a possible embodiment of the present application provides an image capture device, including:
a flash parameter acquisition module, configured to obtain multiple projection focal lengths for a scene to be captured and the multiple flash patterns corresponding to those projection focal lengths;
a flash module, configured to perform multiple projection flashes on the scene to be captured in response to a shooting instruction, wherein each projection flash corresponds to at least one of the multiple projection focal lengths and to at least one flash pattern corresponding to that projection focal length;
an image capture module, configured to respond to the shooting instruction and shoot the scene to be captured multiple times to obtain multiple initial images, wherein each shot corresponds to one of the projection flashes; and
a processing module, configured to synthesize the multiple initial images.
According to the depth distribution information of the photographed objects in a scene to be captured, at least one embodiment of the present application determines the projection focal lengths and flash patterns used to flash the objects of different depth ranges, so that when the scene is shot, those projection focal lengths and flash patterns provide suitable fill light to objects in different depth ranges and a well-exposed image of the scene can be collected.
Brief description of the drawings
Fig. 1 is a flow diagram of a flash control method according to an embodiment of the present application;
Fig. 2 is a schematic diagram of an application scenario of a flash control method according to an embodiment of the present application;
Fig. 3 is a schematic diagram of the depth map corresponding to an application scenario of a flash control method according to an embodiment of the present application;
Figs. 4a-4c are schematic diagrams of three flash patterns, corresponding to the application scenario shown in Fig. 2, obtained by a flash control method according to an embodiment of the present application;
Fig. 5 is a schematic structural block diagram of a flash control device according to an embodiment of the present application;
Fig. 6a is a schematic structural block diagram of another flash control device according to an embodiment of the present application;
Figs. 6b and 6c are schematic structural block diagrams of the information acquisition submodules of two flash control devices according to embodiments of the present application;
Fig. 6d is a schematic structural block diagram of the flash pattern determination submodule of a flash control device according to an embodiment of the present application;
Fig. 6e is a schematic structural block diagram of another flash control device according to an embodiment of the present application;
Fig. 7 is a schematic structural block diagram of yet another flash control device according to an embodiment of the present application;
Fig. 8 is a flow diagram of an image-pickup method according to an embodiment of the present application;
Figs. 9a-9c are schematic diagrams of three initial images obtained by an image-pickup method according to an embodiment of the present application;
Fig. 9d is a schematic diagram of the synthesized image obtained by an image-pickup method according to an embodiment of the present application;
Fig. 10 is a schematic structural block diagram of an image capture device according to an embodiment of the present application;
Fig. 11a is a schematic structural block diagram of another image capture device according to an embodiment of the present application;
Fig. 11b is a schematic structural block diagram of yet another image capture device according to an embodiment of the present application;
Fig. 11c is a schematic structural block diagram of the flash parameter acquisition module of an image capture device according to an embodiment of the present application;
Fig. 11d is a schematic structural block diagram of the second determination submodule of an image capture device according to an embodiment of the present application.
Embodiment
The specific embodiments of the present application are described in further detail below with reference to the accompanying drawings (identical labels represent identical elements across the drawings) and embodiments. The following embodiments illustrate the present application but do not limit its scope.
Those skilled in the art will understand that terms such as "first" and "second" in the present application are only used to distinguish different steps, devices, modules, and the like; they neither carry any particular technical meaning nor indicate a necessary logical order between them.
The inventor of the present application found that when a scene to be captured contains multiple photographed objects at different depths from the shooting position, it is often difficult to obtain a suitable flash effect. For example, when the metering point is far from the shooting position, nearby photographed objects may receive excessive flash and be overexposed; when the metering point is close to the shooting position, distant photographed objects may be underexposed for lack of flash. To address this, as shown in Fig. 1, a possible embodiment of the present application provides a flash control method, including:
S110: obtaining depth distribution information of at least one photographed object in a scene to be captured relative to a shooting reference position;
S120: determining multiple projection focal lengths according to the depth distribution information;
S130: determining, according to at least the depth distribution information, multiple flash patterns in one-to-one correspondence with the multiple projection focal lengths.
For example, a flash control device provided by the present application serves as the executing body of this embodiment and performs S110-S130. Specifically, the flash control device can be provided in a user device in the form of software, hardware, or a combination of both; the user device includes, but is not limited to: a camera, a video camera, a mobile phone with an image acquisition or video shooting function, smart glasses, and the like.
According to the depth distribution information of the photographed objects in a scene to be captured, the technical solution of the embodiments of the present application determines the projection focal lengths and flash patterns used to flash the objects of different depth ranges, so that when the scene is shot, those projection focal lengths and flash patterns provide suitable fill light to objects in different depth ranges and a well-exposed image of the scene can be collected.
Each step of the embodiments of the present application is further detailed in the following embodiments:
S110: obtaining depth distribution information of at least one photographed object in a scene to be captured relative to a shooting reference position.
In the embodiments of the present application, the shooting reference position is a position fixed relative to an image capture device and can be set as required. For example, in one possible embodiment, the shooting reference position can be the position of the imaging surface or the lens of the image capture device; in another possible embodiment, the shooting reference position can be, for example, the position of a depth distribution information acquisition module; or, in yet another possible embodiment, the shooting reference position can be, for example, the position of a flash module.
In the embodiments of the present application, the scene to be captured generally contains at least one photographed object whose depth values span a large range. For example, in one possible embodiment, the at least one photographed object in the scene includes a target object (here, the object the user is interested in), a background object behind the target object, and a foreground object in front of the target object. Of course, in another possible embodiment, the scene may contain only the target object, for example when shooting a close-up of a target object.
In one possible embodiment of the present application, the depth distribution information can be a depth map, for example a depth map of the scene to be captured obtained at the shooting reference position, containing the distance from each point in the scene to the shooting reference position.
The embodiments of the present application can obtain the depth distribution information in a variety of ways, for example:
In one possible embodiment, the depth distribution information can be obtained through information gathering. For instance, a depth sensor of the flash control device can obtain the depth distribution information; the depth sensor can be, for example, an infrared distance sensor, an ultrasonic distance sensor, or a stereo-camera distance sensor.
In another possible embodiment, the depth distribution information can also be obtained from at least one external device. For example, the flash control device may lack a depth sensor while another user device, such as the user's smart glasses, has one; the depth information can then be obtained from that user device. In this embodiment, the flash control device can communicate with the external device through a communication device to obtain the depth information.
S120: determining multiple projection focal lengths according to the depth distribution information.
In the embodiments of the present application, the projection focal length is the distance from the flash module to the imaging surface on which the pattern it projects comes into focus.
In one possible embodiment, step S120 includes:
determining multiple depth ranges of the scene to be captured relative to the shooting reference position according to the depth distribution information; and
determining the multiple projection focal lengths according to the multiple depth ranges.
In this embodiment, the distribution in depth of the at least one photographed object in the scene can be obtained from the depth distribution information, and the multiple depth ranges can then be determined in a targeted manner; see the corresponding description of the embodiment shown in Fig. 2 below.
In this embodiment, when determining the multiple projection focal lengths according to the multiple depth ranges, the average depth of each depth range can serve as the projection focal length corresponding to that range. Of course, those skilled in the art will recognize that in other possible embodiments of the present application, the projection focal length corresponding to a depth range can be determined in other ways, for example by using as the projection focal length the depth value of the photographed surface on which the most photographed objects lie within that depth range.
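The average-depth rule described above can be sketched as follows. This is an illustrative sketch only, not part of the claimed subject matter, and the helper name is hypothetical.

```python
def projection_focal_lengths(depth_ranges):
    """Use the midpoint of each (near, far) depth range, in meters,
    as that range's projection focal length."""
    return [(near + far) / 2.0 for near, far in depth_ranges]
```

Applied to the three depth ranges of the worked example below (1.8-2.2 m, 2.8-3.2 m, 3.8-4.2 m), this yields focal lengths of 2, 3, and 4 meters.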
In other possible embodiments of the present application, step S120 may also ignore the distribution of photographed objects in the scene, obtain the overall depth of the scene from the depth distribution information, and determine multiple projection focal lengths directly from that depth. For example, if the depth distribution information indicates that a scene to be captured is 5 meters deep, a preset rule (such as dividing the depth into N equal parts) can set the multiple projection focal lengths to 1 meter, 2 meters, ... 5 meters.
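The preset equal-division rule mentioned in this example can be sketched as follows (an illustrative sketch under the stated assumption that the far boundary of each part is used; the helper name is hypothetical):

```python
def evenly_divided_focal_lengths(scene_depth, n):
    """Divide the scene depth (meters) into n equal parts and use the
    far boundary of each part as a projection focal length."""
    step = scene_depth / n
    return [step * (i + 1) for i in range(n)]
```

With a 5-meter scene and N = 5, this reproduces the 1, 2, ... 5 meter focal lengths of the example.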
S130: determining, according to at least the depth distribution information, multiple flash patterns in one-to-one correspondence with the multiple projection focal lengths.
In a possible embodiment of the present application, step S130 includes:
determining the shapes of the multiple flash patterns according to the depth distribution information and the multiple depth ranges.
In this embodiment, since each flash pattern provides fill light to the photographed objects of its corresponding depth range, the shapes of the multiple flash patterns can be determined from the depth distribution information and the multiple depth ranges. For example, if a depth range contains a spherical photographed object, the flash pattern corresponding to that depth range can include a circular area corresponding to the spherical object.
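One simple way to derive a pattern shape of this kind from a depth map is a binary mask over the pixels whose depth falls in the range. This is a minimal sketch under the assumption that the depth map is a 2-D list of distances in meters; the function name is hypothetical.

```python
def flash_pattern_shape(depth_map, near, far):
    """Binary mask marking the pixels of the depth map whose depth
    falls inside [near, far]; the lit (1) regions take the shape of
    the photographed objects in that depth range, the rest stays
    dark (0)."""
    return [[1 if near <= depth <= far else 0 for depth in row]
            for row in depth_map]
```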
In one possible embodiment of the present application, besides determining the shapes of the flash patterns, step S130 further includes:
determining the pattern intensities or light transmittances of the multiple flash patterns according to the multiple depth ranges.
In one possible embodiment of the present application, the flash pattern can be the pattern formed directly by the flash light source of a flash module; the flash module can then be, for example, a display projection module that projects the flash pattern.
In the embodiments of the present application, except for the regions corresponding to the photographed objects of the corresponding depth range, the other regions of the flash pattern can be black; that is, when fill light is provided to the objects of one depth range according to the flash pattern, the objects of other depth ranges receive essentially no fill light from the flash module.
In the embodiments of the present application, the pattern intensity of the flash pattern at a point can be obtained such that the light intensity the flash delivers to the point at a given depth meets a set fill-light standard. In one possible embodiment, the pattern intensity of the corresponding flash pattern can be obtained from a lookup table mapping depth values to brightness. In one possible embodiment, to improve the efficiency of obtaining the flash patterns, the pattern intensity can be uniform within the flash pattern corresponding to a depth range, for example by using the projection focal length of the depth range as the depth value from which that range's pattern intensity is determined.
In general, an object farther from the shooting reference position needs to be illuminated by a stronger flash; therefore, generally, the larger the depth values of the depth range corresponding to a flash pattern, the higher the brightness of that flash pattern.
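The depth-to-brightness relationship can be sketched under an inverse-square falloff assumption. The patent only requires that deeper ranges get brighter patterns; the squared-depth law and the helper name below are assumptions for illustration, not part of the claimed subject matter.

```python
def pattern_intensity(depth, reference_depth=1.0, reference_intensity=1.0):
    """Scale the pattern intensity with the square of the depth,
    assuming inverse-square falloff of the flash light, so that the
    light reaching an object at any depth stays roughly constant."""
    return reference_intensity * (depth / reference_depth) ** 2
```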
In another possible embodiment of the present application, the flash pattern is the pattern of a mask plate placed in front of the light source of the flash module; after the light source shines through the mask, it forms a pattern corresponding to the transmissive regions of the flash pattern.
When the flash pattern is the pattern of a mask plate, in one possible embodiment, in order that the objects to be captured in a depth range obtain suitable fill light, the light transmittance of the corresponding flash pattern is determined according to the depth values of that depth range; for example, the larger the depth values of a depth range, the higher the transmittance of its flash pattern.
In another possible embodiment, the transmittance of the flash pattern is fixed: for example, the regions corresponding to the photographed objects of the corresponding depth range are fully transparent and the other regions are fully opaque. In that case, suitable fill light for a depth range can be obtained by adjusting the intensity of the light source. Accordingly, step S130 can also include: determining, according to the multiple depth ranges, multiple flash light intensities in one-to-one correspondence with the multiple flash patterns.
Of course, those skilled in the art will understand that in other possible embodiments of the present application, other factors such as ambient light and the colors of the photographed objects can also be considered as needed when determining the flash patterns.
The steps of the embodiments of the present application are further illustrated through the following application scenario:
For example, as shown in Fig. 2, in one possible embodiment, the scene to be captured contains three photographed objects whose depth distribution information is: the first object 211 is a person at a depth d1 of about 2 meters from the shooting reference position 220; the second object 212 is a landscape object at a depth d2 of about 3 meters; and the third object is a city-wall background 213 at a depth d3 of about 4 meters.
In the embodiments of the present application, a depth map 300 of the scene to be captured relative to the shooting reference position 220 is obtained by a depth sensor, as shown in Fig. 3. In Fig. 3, the first object 211 corresponds to a first area 311, the second object 212 corresponds to a second area 312, and the third object 213 corresponds to a third area 313; the different types of shading represent different distances to the shooting reference position 220 (in the embodiments of the present application, small depth differences are omitted to simplify the representation; for example, the front of the body of the first object 211 is closer to the shooting reference position 220 than the extremities, but they are represented with the same depth value in Fig. 3).
According to the depth distribution information of the three photographed objects represented in the depth map 300, the scene to be captured can be divided into three depth ranges, for example: a first depth range of 1.8-2.2 meters, a second depth range of 2.8-3.2 meters, and a third depth range of 3.8-4.2 meters.
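One plausible way to band the three object depths into such ranges is a symmetric margin around each depth. This is an illustrative sketch only; the 0.2 m margin matches the bands in this example but is otherwise an arbitrary choice, and the helper name is hypothetical.

```python
def depth_ranges(object_depths, margin=0.2):
    """Build one (near, far) depth range per object depth (meters)
    by adding a symmetric margin around each depth."""
    return [(d - margin, d + margin) for d in sorted(object_depths)]
```

For the object depths of 2, 3, and 4 meters in this scenario, the result is the three ranges listed above.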
In this embodiment, taking for example the average depth of each of the three depth ranges, three projection focal lengths corresponding to the three depth ranges can be determined: a first projection focal length of 2 meters, a second of 3 meters, and a third of 4 meters.
From the three areas of the depth map 300 corresponding to the photographed objects of the three depth ranges, a flash pattern corresponding to each depth range can be obtained: as shown in Fig. 4a, a first flash pattern 411 corresponding to the first depth range; as shown in Fig. 4b, a second flash pattern 412 corresponding to the second depth range; and as shown in Fig. 4c, a third flash pattern 413 corresponding to the third depth range.
In the embodiments of the present application, the three flash patterns are all fully transparent, and a flash light intensity corresponding to each flash pattern is determined according to the three depth ranges: the first flash light intensity, corresponding to the first flash pattern 411, is the lowest; the second, corresponding to the second flash pattern 412, is intermediate; and the third, corresponding to the third flash pattern 413, is the highest.
In the embodiment described above, the flash control device does not include a flash module; it merely generates the flash patterns, which can then be supplied to a flash module.
In another possible embodiment of the present application, the flash control device can also include the flash module, and the method then further includes:
in response to a shooting instruction, performing multiple projection flashes on the scene to be captured, wherein each projection flash corresponds to:
at least one of the multiple projection focal lengths, and
at least one flash pattern in the multiple flash patterns corresponding to that at least one projection focal length.
In one possible embodiment of the present application, the shooting instruction can be an instruction produced by a user's action, for example a shutter press or a voice shooting command. In another possible embodiment, the shooting instruction can also be produced when preset shooting conditions are satisfied, for example: in a monitoring scenario, a photo is taken every 5 minutes; or a photo is shot when a moving object enters the scene.
In one possible embodiment of the present application, one projection flash can correspond to only one projection focal length and the one flash pattern corresponding to that focal length.
In another possible embodiment of the present application, one projection flash can correspond to the multiple projection focal lengths and the multiple flash patterns corresponding to them. In this embodiment, the flash module may include multiple flash submodules that can flash and adjust their projection focal lengths independently, so that a single projection flash provides fill light to the photographed objects of different depth ranges respectively.
In another possible embodiment, where the fill light for the objects of different depth ranges must also take the flash light intensity of the flash module into account, each projection flash further corresponds to at least one of the multiple flash light intensities, namely the intensity corresponding to the at least one flash pattern. Likewise, one projection flash can correspond to one flash light intensity or to multiple flash light intensities.
In this embodiment, a projection flash is a very short burst of fill light; of course, those skilled in the art will know that in one possible embodiment the duration of the projection flash illumination can also be set as needed. For example, when the technical solution of the present application is applied to shooting a video scene, the illumination duration of the projection flash can be set according to the time required by the scene.
The embodiments shown in Figs. 2-4c are used below for further illustration:
In one possible embodiment, in response to a shooting instruction, three projection flashes are performed on the scene to be captured:
the first projection flash corresponds to the first projection focal length, the first flash pattern 411, and the first flash light intensity, which provides fill light to the first object 211 with an intensity that meets a set fill-light standard within the first depth range;
the second projection flash corresponds to the second projection focal length, the second flash pattern 412, and the second flash light intensity, which provides fill light to the second object 212 with an intensity that meets the set fill-light standard within the second depth range;
the third projection flash corresponds to the third projection focal length, the third flash pattern 413, and the third flash light intensity, which provides fill light to the third object 213 with an intensity that meets the set fill-light standard within the third depth range.
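The three-flash sequence above can be sketched as a list of per-flash parameter records. This is an illustrative sketch only: the squared-depth intensity law is an assumption (the patent only orders the intensities from nearest-lowest to farthest-highest), and the function name and dictionary keys are hypothetical.

```python
def build_flash_sequence(depth_ranges, patterns):
    """One projection flash per depth range: the range midpoint is the
    projection focal length, and the intensity grows with depth
    (squared-depth law assumed for illustration)."""
    sequence = []
    for (near, far), pattern in zip(depth_ranges, patterns):
        focal = (near + far) / 2.0
        sequence.append({"focal_length": focal,
                         "pattern": pattern,
                         "intensity": focal ** 2})
    return sequence
```

For the three example ranges, this produces flashes at 2, 3, and 4 meters with strictly increasing intensities, matching the ordering described above.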
In one possible case, a flash module can include three flash submodules. Therefore, in another possible embodiment, in response to a shooting instruction, a single projection flash is performed on the scene to be captured; in this projection flash, the first object 211, the second object 212, and the third object 213 can receive fill light simultaneously with the parameters described above. When the shot corresponding to this projection flash is taken, the needed image can be obtained directly.
It can thus be seen that the technical solution of the embodiments of the present application provides suitable fill light to photographed objects of different depths in a scene to be captured.
Those skilled in the art will understand that in the above methods of the embodiments of the present application, the sequence numbers of the steps do not imply an execution order; the execution order of the steps should be determined by their functions and internal logic, and should not limit the implementation of the embodiments of the present application in any way.
As shown in Fig. 5, a possible embodiment of the present application provides a flash control device 500, including:
an information acquisition submodule 510, configured to obtain depth distribution information of at least one photographed object in a scene to be captured relative to a shooting reference position;
a projection focal length determination submodule 520, configured to determine multiple projection focal lengths according to the depth distribution information; and
a flash pattern determination submodule 530, configured to determine, according to at least the depth distribution information, multiple flash patterns in one-to-one correspondence with the multiple projection focal lengths.
According to the depth distribution information of the photographed objects in a scene to be captured, the technical solution of the embodiments of the present application determines the projection focal lengths and flash patterns used to flash the objects of different depth ranges, so that when the scene is shot, those projection focal lengths and flash patterns provide suitable fill light to objects in different depth ranges and a well-exposed image of the scene can be collected.
The modules of the embodiments of the present application are described in further detail below:
As shown in Fig. 6a, in a possible embodiment of the present application, the depth profile information may be a depth map;
in this case, the information acquisition submodule 510 includes:
a depth map acquiring unit 511, configured to obtain the depth map.
The depth map may, for example, include the distance from each point in the scene to be captured to the shooting reference position.
As shown in Fig. 6b, in a possible embodiment of the present application, the information acquisition submodule 510 includes:
a depth profile sensing unit 512, configured to collect the depth profile information.
The depth profile sensing unit 512 may, for example, be an infrared distance sensor, an ultrasonic distance sensor, a stereoscopic ranging sensor or the like, configured to collect the distance information of each point in, for example, the scene to be captured.
As shown in Fig. 6c, in another possible embodiment of the present application, the information acquisition submodule 510 includes:
a communication unit 513, configured to obtain the depth profile information from at least one external device.
For example, the communication unit 513 obtains the depth profile information through communication with another user device such as a mobile phone or a pair of smart glasses.
Optionally, as shown in Fig. 6a, in a possible embodiment of the present application, the projection focal length determination submodule 520 includes:
a depth range determination unit 521, configured to determine, according to the depth profile information, multiple depth ranges of the scene to be captured relative to the shooting reference position; and
a projection focal length determination unit 522, configured to determine the multiple projection focal lengths according to the multiple depth ranges.
In the present embodiment, the distribution in depth of the at least one subject in the scene to be captured can be obtained according to the depth profile information, and the multiple depth ranges can be determined in a targeted manner according to that distribution.
In the present embodiment, when the multiple projection focal lengths are determined according to the multiple depth ranges, the average depth value of each of the multiple depth ranges may be taken as the projection focal length corresponding to that depth range. Of course, those skilled in the art will recognize that, in other possible embodiments of the present application, other approaches may be used to determine the projection focal length corresponding to a depth range, for example taking, as the projection focal length, the depth value corresponding to the photographed surface of the subject occupying the largest part of the depth range.
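As a hedged illustration of the averaging approach above — grouping depth samples into ranges and taking each range's mean depth as its projection focal length — the following Python sketch may help. The gap-threshold clustering, the threshold value and all names are illustrative assumptions, not the required implementation.

```python
# Illustrative sketch: derive depth ranges and projection focal lengths from
# depth profile information. The 1-D clustering by gap threshold is an
# assumption made for this example, not the patent's specified method.

def depth_ranges(depths, gap=1.0):
    """Group sorted depth samples (in metres) into ranges separated by gaps."""
    ds = sorted(depths)
    ranges = [[ds[0], ds[0]]]
    for d in ds[1:]:
        if d - ranges[-1][1] > gap:      # a large gap starts a new depth range
            ranges.append([d, d])
        else:
            ranges[-1][1] = d            # otherwise extend the current range
    return [tuple(r) for r in ranges]

def projection_focal_lengths(ranges):
    """One projection focal length per depth range: the range's mean depth."""
    return [(lo + hi) / 2.0 for lo, hi in ranges]

# Three subjects at roughly 1 m, 3.25 m and 6.25 m from the reference position
samples = [0.5, 1.0, 1.5, 3.0, 3.5, 6.0, 6.5]
rng = depth_ranges(samples)
print(rng)                              # [(0.5, 1.5), (3.0, 3.5), (6.0, 6.5)]
print(projection_focal_lengths(rng))    # [1.0, 3.25, 6.25]
```

The same `depth_ranges` output could instead feed the modal-surface rule mentioned above by replacing the mean with the most frequent depth in each range.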
In other possible embodiments of the present application, the projection focal length determination submodule 520 may also obtain the depth information of the scene to be captured according to the depth profile information without considering the distribution of subjects in the scene, and determine the multiple projection focal lengths directly according to that depth information. Reference may be made to the corresponding description in the above method embodiments.
Optionally, as shown in Fig. 6a, in a possible embodiment of the present application, the flash pattern determination submodule 530 includes:
a pattern shape determination unit 531, configured to determine the shapes of the multiple flash patterns according to the depth profile information and the multiple depth ranges.
In the present embodiment, since each flash pattern serves to provide fill light to the subjects in the corresponding depth range, the shapes of the multiple flash patterns can be determined according to the depth profile information and the multiple depth ranges.
In a possible embodiment of the present application, a flash pattern may be a pattern formed directly by the flash light source of a flash module. In this case the flash module may, for example, be a display projection module configured to project the flash pattern.
In this case, as shown in Fig. 6a, in a possible embodiment of the present application, the flash pattern determination submodule 530 further includes:
a pattern brightness determination unit 532, configured to determine the pattern brightness of the multiple flash patterns according to the multiple depth ranges.
In the embodiments of the present application, the pattern brightness determination unit 532 may obtain the pattern brightness of the flash pattern corresponding to a point by requiring that the flash light intensity arriving at the point corresponding to a depth value meets a set fill-light standard. In a possible embodiment, the pattern brightness determination unit 532 may obtain the pattern brightness of the corresponding flash pattern from a mapping table of depth values to brightness. In a possible embodiment, in order to improve the efficiency of obtaining the flash patterns, the pattern brightness of the flash pattern corresponding to one depth range may be uniform; for example, the projection focal length of the depth range is taken as the depth value of the flash pattern corresponding to that range, and the pattern brightness of the flash pattern is determined accordingly.
In the embodiments of the present application, apart from the regions corresponding to the subjects in the corresponding depth range, the other regions of a flash pattern may be black; that is, when fill light is provided to the objects of one depth range according to the flash pattern, the objects of the other depth ranges receive substantially no fill light from the flash module.
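The black-outside-the-subject behaviour just described can be sketched as follows. The brightness growth with depth (a rough d² scaling so that farther subjects receive enough light) and all concrete values are illustrative assumptions, standing in for the mapping table or fill-light standard named above.

```python
# Illustrative sketch: build one flash pattern per depth range from a depth
# map. Pixels whose depth falls inside the range are lit; all others are
# black. The ~d^2 brightness scaling is an assumption for this example.

def flash_pattern(depth_map, lo, hi, base_brightness=32.0):
    """depth_map: 2-D list of depths (metres). Returns a 2-D brightness mask."""
    focal = (lo + hi) / 2.0                      # depth value used for this range
    level = min(255, int(base_brightness * focal * focal))
    return [[level if lo <= d <= hi else 0 for d in row] for row in depth_map]

depth_map = [
    [1.0, 1.0, 3.0],
    [1.0, 3.0, 3.0],
]
near = flash_pattern(depth_map, 0.5, 1.5)   # lights only the ~1 m subject
far  = flash_pattern(depth_map, 2.5, 3.5)   # lights only the ~3 m subject
print(near)  # [[32, 32, 0], [32, 0, 0]]
print(far)   # [[0, 0, 255], [0, 255, 255]]
```

Note how each pattern is black everywhere except the region of its own depth range, so firing it leaves the other depth ranges essentially unlit.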
In another possible embodiment of the present application, the flash pattern is the pattern of a mask plate arranged in front of the light source of the flash module; after the light of the source passes through the flash pattern, a pattern corresponding to the transmissive regions of the flash pattern is formed.
When the flash pattern is the pattern of the mask plate, in a possible embodiment, in order to enable the objects to be captured in one depth range to obtain suitable fill light, the flash pattern determination submodule 530 may determine the light transmittance of the corresponding flash pattern according to the depth value of the depth range. For example, the greater the depth value of a depth range, the higher the light transmittance of the corresponding flash pattern.
As shown in Fig. 6d, in a possible embodiment of the present application, the flash pattern determination submodule 530 further includes:
a pattern light transmittance determination unit 533, configured to determine the pattern light transmittance of the multiple flash patterns according to the multiple depth ranges.
In another possible embodiment, the light transmittance of the flash pattern is constant; for example, the regions corresponding to the subjects in the corresponding depth range are fully transmissive and the other regions are fully opaque. In this case, suitable fill light for the depth range can be obtained by adjusting the intensity of the light source. As shown in Fig. 6e, in a possible embodiment of the present application, the device 500 further includes:
a flash light intensity determination submodule 540, configured to determine, according to the multiple depth ranges, multiple flash light intensities in one-to-one correspondence with the multiple flash patterns.
Of course, those skilled in the art will appreciate that, in other possible embodiments of the present application, other factors such as the ambient light and the colour of the subjects may also be taken into consideration, as needed, when determining the flash patterns.
In the above embodiments, the flash control device does not include a flash module; it merely creates the flash patterns, which may then be supplied to an external flash module.
As shown in Fig. 6a, in a possible embodiment of the present application, the device 500 further includes:
a first flash submodule 550, configured to perform, in response to a shooting instruction, multiple projection flashes at the scene to be captured, wherein each projection flash in the multiple projection flashes corresponds to:
at least one projection focal length among the multiple projection focal lengths, and
at least one flash pattern, among the multiple flash patterns, corresponding to the at least one projection focal length.
In another possible embodiment, as shown in Fig. 6e, when fill light for the subjects in different depth ranges also needs to take the flash light intensity of the flash module into consideration, each projection flash also corresponds to at least one flash light intensity, among the multiple flash light intensities, corresponding to the at least one flash pattern. In this embodiment, the device 500 further includes:
a second flash submodule 560, configured to perform, in response to a shooting instruction, multiple projection flashes at the scene to be captured, wherein each projection flash in the multiple projection flashes corresponds to:
at least one projection focal length among the multiple projection focal lengths,
at least one flash pattern, among the multiple flash patterns, corresponding to the at least one projection focal length, and
at least one flash light intensity, among the multiple flash light intensities, corresponding to the at least one flash pattern.
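The correspondence just listed — each projection flash paired with its projection focal length, flash pattern and flash light intensity — can be sketched as a simple loop. The `Flash` class and its methods are assumed interfaces invented for this illustration, not a real flash-module API.

```python
# Illustrative sketch of the second flash submodule's behaviour: on a
# shooting instruction, fire one projection flash per depth range, each with
# its own projection focal length, flash pattern and flash light intensity.

class Flash:
    """Assumed flash-module interface; records each projection flash it fires."""
    def __init__(self):
        self.fired = []

    def fire(self, focal_length, pattern, intensity):
        # In hardware this would refocus the projection optics, load the
        # pattern, and trigger the light source at the given intensity.
        self.fired.append((focal_length, pattern, intensity))

def multiple_projection_flashes(flash, focal_lengths, patterns, intensities):
    # One projection flash per (focal length, pattern, intensity) triple,
    # in one-to-one correspondence as described in the embodiment.
    for f, p, i in zip(focal_lengths, patterns, intensities):
        flash.fire(f, p, i)

flash = Flash()
multiple_projection_flashes(flash, [1.0, 3.25, 6.25],
                            ["near-mask", "mid-mask", "far-mask"],
                            [0.3, 0.6, 1.0])
print(flash.fired[0])  # (1.0, 'near-mask', 0.3)
```

Dropping the `intensities` argument and the intensity field yields the first flash submodule 550, which fires with focal length and pattern only.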
For an application scenario of the device 500 of the embodiments of the present application, reference may be made to the corresponding description of the embodiments shown in Figs. 2-4c of the above method embodiments, which is not repeated here.
Fig. 7 is a schematic structural diagram of another flash control device 600 provided by the embodiments of the present application; the specific embodiments of the present application do not limit the specific implementation of the flash control device 600. As shown in Fig. 7, the flash control device 600 may include:
a processor 610, a communications interface 620, a memory 630 and a communication bus 640, wherein:
the processor 610, the communications interface 620 and the memory 630 communicate with one another through the communication bus 640;
the communications interface 620 is configured to communicate with network elements such as clients;
the processor 610 is configured to execute a program 632, and may specifically perform the relevant steps in the above method embodiments.
Specifically, the program 632 may include program code, and the program code includes computer operation instructions.
The processor 610 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
The memory 630 is configured to store the program 632. The memory 630 may include a high-speed RAM memory, and may also include a non-volatile memory, for example at least one magnetic disk memory. The program 632 may specifically be configured to cause the flash control device 600 to perform the following steps:
obtaining depth profile information of at least one subject in a scene to be captured relative to a shooting reference position;
determining multiple projection focal lengths according to the depth profile information; and
determining, at least according to the depth profile information, multiple flash patterns in one-to-one correspondence with the multiple projection focal lengths.
For the specific implementation of each step in the program 632, reference may be made to the corresponding description of the corresponding steps and units in the above embodiments, which is not repeated here. Those skilled in the art may clearly understand that, for convenience and brevity of description, for the specific working process of the devices and modules described above, reference may be made to the corresponding process description in the foregoing method embodiments, which is not repeated here.
As shown in Fig. 8, a possible embodiment of the present application further provides an image capture method, including:
S810: obtaining multiple projection focal lengths in a scene to be captured and multiple flash patterns corresponding to the multiple projection focal lengths;
S820: in response to a shooting instruction, performing multiple projection flashes at the scene to be captured, and shooting the scene to be captured multiple times to obtain multiple initial images, wherein each projection flash in the multiple projection flashes corresponds to at least one projection focal length among the multiple projection focal lengths and to at least one flash pattern, among the multiple flash patterns, corresponding to the at least one projection focal length, and each shot in the multiple shots corresponds to one projection flash in the multiple projection flashes;
S830: synthesizing the multiple initial images.
For example, an image capture device provided by the embodiments of the present application serves as the execution subject of the present embodiment and performs S810 to S830. Specifically, the image capture device includes, but is not limited to: a camera, a video camera, a mobile phone with an image capture or video shooting function, smart glasses, and the like.
According to the depth profile information of the subjects in the scene to be captured, the technical solution of the embodiments of the present application determines the projection focal lengths and flash patterns used for flashing at subjects in different depth ranges, so that when the scene to be captured is shot, suitable fill light can be provided, by means of the projection focal lengths and flash patterns, to subjects in different depth ranges of the scene, and an image of the scene with good exposure can thereby be captured.
The steps of the embodiments of the present application are described in further detail through the following embodiments:
S810: obtaining multiple projection focal lengths in a scene to be captured and multiple flash patterns corresponding to the multiple projection focal lengths.
In the embodiments of the present application, step S810 may obtain the multiple projection focal lengths and the multiple flash patterns in a variety of ways, for example:
in a possible embodiment, step S810 includes:
obtaining the multiple projection focal lengths and the multiple flash patterns from at least one external device.
In a possible embodiment, the image capture device may be a digital camera; another user device of the user, such as a mobile phone or a pair of smart glasses, obtains the depth profile information of the current scene to be captured through a depth sensor with which it is equipped, and obtains the multiple projection focal lengths and the multiple flash patterns according to the depth profile information; the image capture device then obtains the multiple projection focal lengths and the multiple flash patterns through communication with the external device.
In another possible embodiment, the way in which step S810 obtains the multiple projection focal lengths and the corresponding multiple flash patterns is the same as the way in which they are obtained in the flash control method of the embodiment shown in Fig. 1, including:
obtaining depth profile information of at least one subject in the scene to be captured relative to a shooting reference position;
determining the multiple projection focal lengths according to the depth profile information; and
determining, at least according to the depth profile information, the multiple flash patterns in one-to-one correspondence with the multiple projection focal lengths.
For further description of how the present embodiment obtains the multiple projection focal lengths and the multiple flash patterns, reference may be made to the corresponding description of the embodiments shown in Figs. 1-4c, which is not repeated here.
S820: in response to a shooting instruction, performing multiple projection flashes at the scene to be captured, and shooting the scene to be captured multiple times to obtain multiple initial images.
In a possible embodiment of the present application, the shooting instruction may be an instruction produced according to an operation of the user, for example produced according to the user pressing the shutter, giving a voice shooting command, or the like; in another possible embodiment, the shooting instruction may also be produced when certain preset shooting conditions are satisfied.
In a possible embodiment, as described in the embodiment shown in Fig. 1, step S820 performs multiple projection flashes at the scene to be captured, wherein each projection flash corresponds to at least one depth range, and the projection flash is performed with at least one projection focal length and at least one flash pattern corresponding to the at least one depth range.
In a possible embodiment, when each depth range also corresponds to a flash light intensity, the projection flash corresponding to a depth range is also performed with the flash light intensity corresponding to that depth range.
In the present embodiment, the parameters of each shot may be identical. Of course, in other possible embodiments of the present application, they may be adjusted according to the shooting effect needed by the user or according to the multiple projection focal lengths, for example: the shooting focal length of each shot matches the projection focal length of the corresponding projection flash.
S830: synthesizing the multiple initial images.
In a possible embodiment of the present application, step S830 includes:
determining at least one image subregion of each initial image in the multiple initial images; and
synthesizing the multiple initial images according to the at least one image subregion of each initial image.
Since, on the initial image corresponding to one flash, the subjects in the depth range corresponding to that flash are properly exposed, the image regions corresponding to those subjects on the initial image should meet at least one exposure standard (for example a brightness standard, a resolution standard, etc.). Therefore, in the present embodiment, the at least one image subregion on each initial image can be determined according to the exposure effect of each region of the multiple initial images obtained.
After the at least one image subregion of each initial image in the multiple initial images is obtained, suitable image subregions can be selected for stitching and fusion, wherein, in a possible embodiment, the boundary pixels between image subregions can be blurred or averaged using fusion techniques to preserve the continuity of the whole photo.
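The exposure-based selection above can be sketched as follows. Choosing, per pixel, the shot whose value lies closest to a mid-grey target is an illustrative stand-in for the exposure standard (brightness, resolution, etc.), and the boundary blurring/averaging step is omitted for brevity.

```python
# Illustrative sketch of synthesising the initial images: for every pixel,
# keep the value from whichever initial image is best exposed there. Images
# are toy 2-D lists of 8-bit brightness values.

TARGET = 128  # assumed "proper exposure" brightness for 8-bit images

def synthesize(initial_images):
    h, w = len(initial_images[0]), len(initial_images[0][0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # pick the shot whose pixel is closest to proper exposure
            out[y][x] = min((img[y][x] for img in initial_images),
                            key=lambda v: abs(v - TARGET))
    return out

# Two toy 2x2 shots: the first exposes the left half well, the second the right
shot_near = [[130, 20], [126, 15]]
shot_far  = [[250, 120], [255, 131]]
print(synthesize([shot_near, shot_far]))  # [[130, 120], [126, 131]]
```

A production version would operate on whole subregions rather than individual pixels and blend the subregion boundaries, as the embodiment describes.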
In addition to performing the image synthesis according to the exposure effect of the obtained initial images, in another possible embodiment of the present application the image synthesis may also be performed according to the target image subregions, on the corresponding initial images, that correspond to the target areas of each shot in the scene to be captured. Therefore, in this embodiment, step S830 may include:
determining at least one target image subregion of each initial image in the multiple initial images according to the depth information distribution and the multiple flash patterns; and
synthesizing the multiple initial images according to the at least one target image subregion of each initial image.
Optionally, in a possible embodiment, determining the at least one target image subregion of each initial image includes:
determining at least one target subject of each shot in the multiple shots according to the depth information distribution and the multiple flash patterns; and
determining the at least one target image subregion of each initial image according to the at least one target subject of each shot.
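The two determination steps above can be sketched as follows. The subject names, depths and the containment rule (a subject is a shot's target when its depth lies inside the depth range of that shot's flash pattern) are illustrative assumptions for this example.

```python
# Illustrative sketch: map each shot to its target subject(s). A subject is
# a target of the shot whose flash pattern's depth range contains the
# subject's depth; the target image subregion would then be that subject's
# region on the corresponding initial image.

subjects = {"first object 211": 1.0,
            "second object 212": 3.0,
            "third object 213": 6.0}
shot_depth_ranges = [(0.5, 1.5), (2.5, 3.5), (5.5, 6.5)]  # one per projection flash

def targets_per_shot(subjects, ranges):
    return [[name for name, d in subjects.items() if lo <= d <= hi]
            for lo, hi in ranges]

print(targets_per_shot(subjects, shot_depth_ranges))
# [['first object 211'], ['second object 212'], ['third object 213']]
```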
For example, in the embodiments shown in Figs. 2-4c, the image subregions corresponding respectively to the first object 211, the second object 212 and the third object 213 on each initial image can be determined according to the depth map 300 of the scene to be captured; and it can be determined, according to parameters such as the projection focal length and flash pattern of each shot, that, for example, the target subject of the first shot is the first object 211, the target subject of the second shot is the second object 212, and the target subject of the third shot is the third object 213.
Therefore, as shown in Figs. 9a-9c, in the first initial image 910 taken with the flash at the first projection focal length and the first flash pattern, the target image subregion is the first target image subregion 911 corresponding to the first object 211 (target image subregions are represented by diagonal hatching); likewise, the target image subregion in the second initial image 920, taken with the flash at the second projection focal length and the second flash pattern, is the second target image subregion 921 corresponding to the second object 212; and the target image subregion in the third initial image 930, taken with the flash at the third projection focal length and the third flash pattern, is the third target image subregion 931 corresponding to the third object 213. As shown in Fig. 9d, synthesizing these three target image subregions yields a composite image 940 in which every depth is properly exposed.
It will be understood by those skilled in the art that, in the above method of the embodiments of the present application, the magnitude of the sequence number of each step does not imply an order of execution; the execution order of the steps should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
As shown in Fig. 10, the embodiments of the present application further provide an image capture device 1000, including:
a flash parameter acquisition module 1010, configured to obtain multiple projection focal lengths in a scene to be captured and multiple flash patterns corresponding to the multiple projection focal lengths;
a flash module 1020, configured to perform, in response to a shooting instruction, multiple projection flashes at the scene to be captured, wherein each projection flash in the multiple projection flashes corresponds to at least one projection focal length among the multiple projection focal lengths and to at least one flash pattern, among the multiple flash patterns, corresponding to the at least one projection focal length;
an image capture module 1030, configured to shoot, in response to the shooting instruction, the scene to be captured multiple times to obtain multiple initial images, wherein each shot in the multiple shots corresponds to one projection flash in the multiple projection flashes; and
a processing module 1040, configured to synthesize the multiple initial images.
According to the depth profile information of the subjects in the scene to be captured, the technical solution of the embodiments of the present application determines the projection focal lengths and flash patterns used for flashing at subjects in different depth ranges, so that when the scene to be captured is shot, suitable fill light can be provided, by means of the projection focal lengths and flash patterns, to subjects in different depth ranges of the scene, and an image of the scene with good exposure can thereby be captured.
The modules of the embodiments of the present application are described in further detail below:
Optionally, as shown in Fig. 11a, in a possible embodiment of the present application, the flash parameter acquisition module 1010 may include:
a communication submodule 1011, configured to obtain the multiple projection focal lengths and the multiple flash patterns from at least one external device.
For example, in a possible embodiment, the image capture device 1000 may be a digital camera; another user device of the user, such as a mobile phone or a pair of smart glasses, obtains the depth information of the current scene to be captured through a depth sensor with which it is equipped, and obtains multiple groups of flash parameters according to the depth information; the image capture device 1000 then obtains the multiple projection focal lengths and the multiple flash patterns through communication with the external device.
Optionally, as shown in Fig. 11b, in another possible embodiment of the present application, the flash parameter acquisition module 1010 may be the flash control device described in the above embodiments, including:
an information acquisition submodule 1012, configured to obtain depth profile information of at least one subject in the scene to be captured relative to a shooting reference position;
a projection focal length determination submodule 1013, configured to determine the multiple projection focal lengths according to the depth profile information; and
a flash pattern determination submodule 1014, configured to determine, at least according to the depth profile information, the multiple flash patterns in one-to-one correspondence with the multiple projection focal lengths.
In the embodiments of the present application, the functions of the information acquisition submodule 1012, the projection focal length determination submodule 1013 and the flash pattern determination submodule 1014 are identical to the functions and structures of the corresponding submodules in the embodiments shown in Figs. 5-6d, and are not repeated here.
As shown in Fig. 11c, in another possible embodiment of the present application, the flash parameter acquisition module 1010 further includes:
a flash light intensity determination submodule 1015, configured to determine, according to the multiple depth ranges, multiple flash light intensities in one-to-one correspondence with the multiple flash patterns.
In the present embodiment, each projection flash in the multiple projection flashes performed by the flash module 1020 also corresponds to at least one flash light intensity, among the multiple flash light intensities, corresponding to the at least one flash pattern.
Optionally, as shown in Fig. 11a, in a possible embodiment, the processing module 1040 includes:
a first determination submodule 1041, configured to determine at least one image subregion of each initial image in the multiple initial images; and
a first synthesis submodule 1042, configured to synthesize the multiple initial images according to the at least one image subregion of each initial image.
Optionally, as shown in Fig. 11b, in a possible embodiment, the processing module 1040 includes:
a second determination submodule 1043, configured to determine at least one target image subregion of each initial image in the multiple initial images according to the depth information distribution and the multiple flash patterns; and
a second synthesis submodule 1044, configured to synthesize the multiple initial images according to the at least one target image subregion of each initial image.
In a possible embodiment, as shown in Fig. 11d, the second determination submodule 1043 includes:
a target determination unit 1043a, configured to determine at least one target subject of each shot in the multiple shots according to the depth information distribution and the multiple flash patterns; and
a subregion determination unit 1043b, configured to determine the at least one target image subregion of each initial image according to the at least one target subject of each shot.
In the embodiments of the present application, for further description of the functions of the modules and units, reference may be made to the corresponding description in the above image capture method embodiments, which is not repeated here.
Those of ordinary skill in the art may realize that the units and method steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or in software depends on the specific application and the design constraints of the technical solution. Skilled artisans may implement the described functions using different methods for each specific application, but such implementations should not be considered to go beyond the scope of the present application.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The above embodiments are merely intended to illustrate the present application and are not limitations thereof. Persons of ordinary skill in the relevant technical field may make various changes and modifications without departing from the spirit and scope of the present application; accordingly, all equivalent technical solutions fall within the scope of the present application, and the patent protection scope of the present application shall be defined by the claims.
Claims (43)
- 1. A flash control method, characterized by comprising: obtaining depth distribution information of at least one subject in a scene to be captured relative to a shooting reference position; determining multiple projection focal lengths according to the depth distribution information; and determining, at least according to the depth distribution information, multiple flash patterns in one-to-one correspondence with the multiple projection focal lengths.
- 2. The method according to claim 1, characterized in that determining the multiple projection focal lengths according to the depth distribution information comprises: determining, according to the depth distribution information, multiple depth ranges of the scene to be captured relative to the shooting reference position; and determining the multiple projection focal lengths according to the multiple depth ranges.
- 3. The method according to claim 2, characterized in that determining the multiple flash patterns comprises: determining the shapes of the multiple flash patterns according to the depth distribution information and the multiple depth ranges.
- 4. The method according to claim 3, characterized in that determining the multiple flash patterns further comprises: determining the pattern intensity or the light transmittance of the multiple flash patterns according to the multiple depth ranges.
- 5. The method according to claim 3, characterized in that the method further comprises: determining, according to the multiple depth ranges, multiple flash intensities in one-to-one correspondence with the multiple flash patterns.
- 6. The method according to claim 1, characterized in that the depth distribution information comprises a depth map.
- 7. The method according to claim 1, characterized in that obtaining the depth distribution information comprises: collecting the depth distribution information by sensing.
- 8. The method according to claim 1, characterized in that obtaining the depth distribution information comprises: obtaining the depth distribution information from at least one external device.
- 9. The method according to claim 1, characterized in that the method further comprises: performing, in response to a shooting instruction, at least one projection flash on the scene to be captured, wherein each projection flash of the at least one projection flash corresponds to: at least one projection focal length among the multiple projection focal lengths; and, among the multiple flash patterns, at least one flash pattern corresponding to the at least one projection focal length.
- 10. The method according to claim 5, characterized in that the method further comprises: performing, in response to a shooting instruction, at least one projection flash on the scene to be captured, wherein each projection flash of the at least one projection flash corresponds to: at least one projection focal length among the multiple projection focal lengths; among the multiple flash patterns, at least one flash pattern corresponding to the at least one projection focal length; and, among the multiple flash intensities, at least one flash intensity corresponding to the at least one flash pattern.
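As an illustration only, and not the patented implementation, the method of claims 1 to 10 can be sketched in Python: split a depth map into depth ranges, then derive one projection focal length, one flash pattern (a mask over the scene), and one flash intensity per range. The quantile split, the focus-at-range-midpoint rule, and the inverse-square intensity rule are all assumptions made for this sketch.

```python
import numpy as np

def determine_flash_parameters(depth_map, num_ranges=3):
    """Sketch of the claimed flow: depth map -> depth ranges ->
    (projection focal length, flash pattern, flash intensity) per range.
    All concrete rules below are illustrative assumptions."""
    finite = depth_map[np.isfinite(depth_map)]
    # Partition the scene into depth ranges; here an equal-population
    # quantile split over the finite depths (assumed heuristic).
    edges = np.quantile(finite, np.linspace(0.0, 1.0, num_ranges + 1))
    mids = (edges[:-1] + edges[1:]) / 2.0
    params = []
    for (near, far), mid in zip(zip(edges[:-1], edges[1:]), mids):
        params.append({
            "range": (near, far),
            # Focus the light projection at the middle of the range
            # (assumed focal-length rule).
            "focal_length": mid,
            # Flash pattern: a binary mask lighting only the subjects
            # that fall inside this depth range.
            "pattern": (depth_map >= near) & (depth_map <= far),
            # Flash intensity: compensate inverse-square falloff
            # relative to the nearest range (assumed rule).
            "intensity": (mid / mids[0]) ** 2,
        })
    return params
```

Each returned entry corresponds to one projection flash: the focal length sets where the projected light is sharp, the pattern limits which subjects it reaches, and the intensity scales the fill light for that depth.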
- 11. A flash control device, characterized by comprising: an information acquisition submodule, configured to obtain depth distribution information of at least one subject in a scene to be captured relative to a shooting reference position; a projection focal length determination submodule, configured to determine multiple projection focal lengths according to the depth distribution information; and a flash pattern determination submodule, configured to determine, at least according to the depth distribution information, multiple flash patterns in one-to-one correspondence with the multiple projection focal lengths.
- 12. The device according to claim 11, characterized in that the projection focal length determination submodule comprises: a depth range determination unit, configured to determine, according to the depth distribution information, multiple depth ranges of the scene to be captured relative to the shooting reference position; and a projection focal length determination unit, configured to determine the multiple projection focal lengths according to the multiple depth ranges.
- 13. The device according to claim 12, characterized in that the flash pattern determination submodule comprises: a pattern shape determination unit, configured to determine the shapes of the multiple flash patterns according to the depth distribution information and the multiple depth ranges.
- 14. The device according to claim 13, characterized in that the flash pattern determination submodule further comprises: a pattern intensity determination unit, configured to determine the pattern intensity of the multiple flash patterns according to the multiple depth ranges.
- 15. The device according to claim 13, characterized in that the flash pattern determination submodule further comprises: a pattern light transmittance determination unit, configured to determine the light transmittance of the multiple flash patterns according to the multiple depth ranges.
- 16. The device according to claim 13, characterized in that the device further comprises: a flash intensity determination submodule, configured to determine, according to the multiple depth ranges, multiple flash intensities in one-to-one correspondence with the multiple flash patterns.
- 17. The device according to claim 11, characterized in that the depth distribution information comprises a depth map, and the information acquisition submodule comprises: a depth map acquisition unit, configured to obtain the depth map.
- 18. The device according to claim 11, characterized in that the information acquisition submodule comprises: a depth distribution sensing unit, configured to collect the depth distribution information.
- 19. The device according to claim 11, characterized in that the information acquisition submodule comprises: a communication unit, configured to obtain the depth distribution information from at least one external device.
- 20. The device according to claim 11, characterized in that the device further comprises: a first flash submodule, configured to perform, in response to a shooting instruction, at least one projection flash on the scene to be captured, wherein each projection flash of the at least one projection flash corresponds to: at least one projection focal length among the multiple projection focal lengths; and, among the multiple flash patterns, at least one flash pattern corresponding to the at least one projection focal length.
- 21. The device according to claim 16, characterized in that the device further comprises: a second flash submodule, configured to perform, in response to a shooting instruction, at least one projection flash on the scene to be captured, wherein each projection flash of the at least one projection flash corresponds to: at least one projection focal length among the multiple projection focal lengths; among the multiple flash patterns, at least one flash pattern corresponding to the at least one projection focal length; and, among the multiple flash intensities, at least one flash intensity corresponding to the at least one flash pattern.
- 22. An image acquisition method, characterized by comprising: obtaining depth distribution information of at least one subject in a scene to be captured relative to a shooting reference position; determining multiple projection focal lengths according to the depth distribution information; determining, at least according to the depth distribution information, multiple flash patterns in one-to-one correspondence with the multiple projection focal lengths; in response to a shooting instruction, performing multiple projection flashes on the scene to be captured and shooting the scene to be captured multiple times to obtain multiple initial images, wherein each projection flash of the multiple projection flashes corresponds to at least one projection focal length among the multiple projection focal lengths and, among the multiple flash patterns, at least one flash pattern corresponding to the at least one projection focal length, and each shot of the multiple shots corresponds to one projection flash of the multiple projection flashes; and synthesizing the multiple initial images.
- 23. The method according to claim 22, characterized in that obtaining the multiple projection focal lengths and the multiple flash patterns comprises: obtaining the multiple projection focal lengths and the multiple flash patterns from at least one external device.
- 24. The method according to claim 22, characterized in that determining the multiple projection focal lengths according to the depth distribution information comprises: determining, according to the depth distribution information, multiple depth ranges of the scene to be captured relative to the shooting reference position; and determining the multiple projection focal lengths according to the multiple depth ranges.
- 25. The method according to claim 24, characterized in that determining the multiple flash patterns comprises: determining the shapes of the multiple flash patterns according to the depth distribution information and the multiple depth ranges.
- 26. The method according to claim 25, characterized in that determining the multiple flash patterns further comprises: determining the pattern intensity or the light transmittance of the multiple flash patterns according to the multiple depth ranges.
- 27. The method according to claim 22, characterized in that the depth distribution information comprises a depth map.
- 28. The method according to claim 25, characterized in that the method further comprises: determining, according to the multiple depth ranges, multiple flash intensities in one-to-one correspondence with the multiple flash patterns.
- 29. The method according to claim 28, characterized in that each projection flash of the multiple projection flashes further corresponds to, among the multiple flash intensities, at least one flash intensity corresponding to the at least one flash pattern.
- 30. The method according to claim 23, characterized in that synthesizing the multiple initial images comprises: determining at least one image region of each initial image of the multiple initial images; and synthesizing the multiple initial images according to the at least one image region of each initial image.
- 31. The method according to claim 24, characterized in that synthesizing the multiple initial images comprises: determining, according to the depth distribution information and the multiple flash patterns, at least one target image subregion of each initial image of the multiple initial images; and synthesizing the multiple initial images according to the at least one target image subregion of each initial image.
- 32. The method according to claim 31, characterized in that determining the at least one target image subregion of each initial image comprises: determining, according to the depth distribution information and the multiple flash patterns, at least one target subject of each shot of the multiple shots; and determining the at least one target image subregion of each initial image according to the at least one target subject of each shot.
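The synthesis step of the image acquisition method (claims 22 to 32) can likewise be sketched. This is a hypothetical reconstruction, not the patented algorithm: it assumes each initial image's target image subregion is exactly the region its flash pattern illuminated, and composites by taking those pixels from each capture.

```python
import numpy as np

def synthesize(initial_images, patterns):
    """Composite multiple initial images, each captured under one flash
    pattern, by keeping each image's pixels inside its own pattern mask.
    The mask-equals-subregion assumption is illustrative only."""
    result = np.zeros_like(initial_images[0])
    covered = np.zeros(initial_images[0].shape[:2], dtype=bool)
    for image, mask in zip(initial_images, patterns):
        take = mask & ~covered          # do not overwrite earlier regions
        result[take] = image[take]
        covered |= take
    # Fall back to the first capture where no pattern covered the pixel.
    result[~covered] = initial_images[0][~covered]
    return result
```

A real implementation would likely blend at mask boundaries and weight by exposure quality, but the hard-mask version shows the structure of the claimed synthesis.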
- 33. An image acquisition device, characterized by comprising: an information acquisition submodule, configured to obtain depth distribution information of at least one subject in a scene to be captured relative to a shooting reference position; a projection focal length determination submodule, configured to determine multiple projection focal lengths according to the depth distribution information; a flash pattern determination submodule, configured to determine, at least according to the depth distribution information, multiple flash patterns in one-to-one correspondence with the multiple projection focal lengths; a flash module, configured to perform, in response to a shooting instruction, multiple projection flashes on the scene to be captured, wherein each projection flash of the multiple projection flashes corresponds to at least one projection focal length among the multiple projection focal lengths and, among the multiple flash patterns, at least one flash pattern corresponding to the at least one projection focal length; an image acquisition module, configured to shoot the scene to be captured multiple times in response to the shooting instruction to obtain multiple initial images, wherein each shot of the multiple shots corresponds to one projection flash of the multiple projection flashes; and a processing module, configured to synthesize the multiple initial images.
- 34. The device according to claim 33, characterized in that the flash parameter acquisition module comprises: a communication submodule, configured to obtain the multiple projection focal lengths and the multiple flash patterns from at least one external device.
- 35. The device according to claim 33, characterized in that the projection focal length determination submodule comprises: a depth range determination unit, configured to determine, according to the depth distribution information, multiple depth ranges of the scene to be captured relative to the shooting reference position; and a projection focal length determination unit, configured to determine the multiple projection focal lengths according to the multiple depth ranges.
- 36. The device according to claim 35, characterized in that the flash pattern determination submodule comprises: a pattern shape determination unit, configured to determine the shapes of the multiple flash patterns according to the depth distribution information and the multiple depth ranges.
- 37. The device according to claim 36, characterized in that the flash pattern determination submodule further comprises: a pattern intensity determination unit, configured to determine the pattern intensity of the multiple flash patterns according to the multiple depth ranges; or a pattern light transmittance determination unit, configured to determine the light transmittance of the multiple flash patterns according to the multiple depth ranges.
- 38. The device according to claim 33, characterized in that the depth distribution information comprises a depth map, and the information acquisition submodule comprises: a depth map acquisition unit, configured to obtain the depth map.
- 39. The device according to claim 36, characterized in that the flash parameter acquisition module further comprises: a flash intensity determination submodule, configured to determine, according to the multiple depth ranges, multiple flash intensities in one-to-one correspondence with the multiple flash patterns.
- 40. The device according to claim 39, characterized in that each projection flash of the multiple projection flashes performed by the flash module further corresponds to, among the multiple flash intensities, at least one flash intensity corresponding to the at least one flash pattern.
- 41. The device according to claim 33, characterized in that the processing module comprises: a first determination submodule, configured to determine at least one image region of each initial image of the multiple initial images; and a first synthesis submodule, configured to synthesize the multiple initial images according to the at least one image region of each initial image.
- 42. The device according to claim 35, characterized in that the processing module comprises: a second determination submodule, configured to determine, according to the depth distribution information and the multiple flash patterns, at least one target image subregion of each initial image of the multiple initial images; and a second synthesis submodule, configured to synthesize the multiple initial images according to the at least one target image subregion of each initial image.
- 43. The device according to claim 42, characterized in that the second determination submodule comprises: a target determination unit, configured to determine, according to the depth distribution information and the multiple flash patterns, at least one target subject of each shot of the multiple shots; and a subregion determination unit, configured to determine the at least one target image subregion of each initial image according to the at least one target subject of each shot.
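Putting the pieces together, the capture flow described in the claims (one projection flash plus one shot per parameter set, in response to a shooting instruction) might look like the following sketch. The `flash` and `camera` objects are hypothetical drivers, and every method name here is an assumption, not an API from the patent.

```python
def capture_scene(flash, camera, params):
    """Fire one projection flash per (focal length, pattern, intensity)
    parameter set and take one shot per flash, returning the initial
    images to be synthesized.  `flash`/`camera` are hypothetical."""
    images = []
    for p in params:
        # Configure the projection flash for this depth range.
        flash.set_projection_focal_length(p["focal_length"])
        flash.set_pattern(p["pattern"])
        flash.set_intensity(p["intensity"])
        flash.fire()
        # Each shot is paired with exactly one projection flash.
        images.append(camera.shoot())
    return images
```

The returned list would then be passed to the synthesis step to produce the final, evenly exposed image.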
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410373832.8A CN104092955B (en) | 2014-07-31 | 2014-07-31 | Flash control method and control device, image-pickup method and collecting device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104092955A CN104092955A (en) | 2014-10-08 |
CN104092955B true CN104092955B (en) | 2018-04-27 |
Family
ID=51640634
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410373832.8A Active CN104092955B (en) | 2014-07-31 | 2014-07-31 | Flash control method and control device, image-pickup method and collecting device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104092955B (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104656348B (en) * | 2015-01-13 | 2016-10-05 | 苏州佳世达光电有限公司 | Projection arrangement and projecting method |
CN105472268B (en) * | 2015-12-24 | 2019-06-07 | Tcl集团股份有限公司 | A kind of shooting light compensation method and device |
CN105915810B (en) * | 2016-03-31 | 2020-02-21 | 联想(北京)有限公司 | Control method of electronic equipment and electronic equipment |
CN107241557A (en) * | 2017-06-16 | 2017-10-10 | 广东欧珀移动通信有限公司 | Image exposure method, device, picture pick-up device and storage medium |
CN107493432B (en) * | 2017-08-31 | 2020-01-10 | Oppo广东移动通信有限公司 | Image processing method, image processing device, mobile terminal and computer readable storage medium |
CN107509031B (en) * | 2017-08-31 | 2019-12-27 | Oppo广东移动通信有限公司 | Image processing method, image processing device, mobile terminal and computer readable storage medium |
CN108174085A (en) * | 2017-12-19 | 2018-06-15 | 信利光电股份有限公司 | A kind of image pickup method of multi-cam, filming apparatus, mobile terminal and readable storage medium storing program for executing |
CN108564614B (en) * | 2018-04-03 | 2020-09-18 | Oppo广东移动通信有限公司 | Depth acquisition method and apparatus, computer-readable storage medium, and computer device |
CN110868528B (en) * | 2019-11-22 | 2021-03-23 | 维沃移动通信有限公司 | Light supplement control method and electronic equipment |
CN113132639B (en) * | 2021-04-22 | 2023-02-03 | 亮风台(上海)信息科技有限公司 | Image processing method and device, electronic equipment and storage medium |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3873157B2 (en) * | 1997-11-13 | 2007-01-24 | カシオ計算機株式会社 | Electronic camera device and imaging method |
US20040100573A1 (en) * | 2002-11-21 | 2004-05-27 | Osamu Nonaka | Focusing apparatus and camera including the same |
CN101489050B (en) * | 2008-01-15 | 2011-07-20 | 华晶科技股份有限公司 | Automatic exposure control method |
CN102036017B (en) * | 2009-09-30 | 2012-07-25 | 华晶科技股份有限公司 | Method for controlling flashing module |
CN102131051B (en) * | 2010-12-28 | 2012-11-28 | 惠州Tcl移动通信有限公司 | Image pick-up equipment and image acquisition method and device thereof |
CN102843506A (en) * | 2011-06-24 | 2012-12-26 | 瑞轩科技股份有限公司 | Camera system and image shooting and synthesizing method thereof |
TWI461813B (en) * | 2012-02-24 | 2014-11-21 | Htc Corp | Image capture method and image capture system thereof |
CN103543575A (en) * | 2012-07-10 | 2014-01-29 | 宏碁股份有限公司 | Image acquisition device and light source assisted photographing method |
- 2014-07-31 CN CN201410373832.8A patent/CN104092955B/en active Active
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |