CN104580886A - Photographing control method and device - Google Patents

Photographing control method and device

Info

Publication number
CN104580886A
Authority
CN
China
Prior art keywords
target object
face
light
terminal
illumination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410776081.4A
Other languages
Chinese (zh)
Other versions
CN104580886B (en)
Inventor
唐明勇
刘华一君
陈涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Technology Co Ltd
Xiaomi Inc
Original Assignee
Xiaomi Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiaomi Inc filed Critical Xiaomi Inc
Priority to CN201410776081.4A priority Critical patent/CN104580886B/en
Publication of CN104580886A publication Critical patent/CN104580886A/en
Application granted granted Critical
Publication of CN104580886B publication Critical patent/CN104580886B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Studio Devices (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a photographing control method and device, applied to a terminal. The photographing control method comprises the following steps: recognizing the face orientation of a target object photographed by the terminal; determining a target fill-light direction according to the face orientation of the target object; and supplementing light in the target fill-light direction to obtain a target image of the target object. With this photographing control method, in order to achieve a preset photographing effect, light can be supplemented in the target fill-light direction corresponding to the face orientation of the target object before or during photographing, so that in the resulting target image the ambient light around the target object is in an ideal state. A satisfactory, high-quality photo can therefore be obtained, the process of adjusting and processing the photo after shooting is omitted, and the efficiency of obtaining high-quality photos is improved.

Description

Photographing control method and device
Technical field
The present disclosure relates to the technical field of intelligent devices, and in particular to a photographing control method and device.
Background
With the popularization of mobile terminals such as smartphones and tablet computers, capturing video and uploading it to video websites has become extremely convenient. More and more users are accustomed to recording everyday life as video or pictures with a mobile terminal.
In this fast-paced era, people care about how efficiently content can be shared. For ordinary users, however, photos taken directly are often unsatisfactory, so after shooting they still need to adjust the photos with one or more photo-processing methods before obtaining photos of satisfactory quality.
Summary of the invention
To overcome the problems in the related art, the present disclosure provides a photographing control method and device.
According to a first aspect of the embodiments of the present disclosure, a photographing control method applied to a terminal is provided, the method comprising:
recognizing the face orientation of a target object photographed by the terminal;
determining a target fill-light direction according to the face orientation of the target object; and
supplementing light in the target fill-light direction to obtain a target image of the target object.
With reference to the first aspect, in a first possible implementation of the first aspect, recognizing the face orientation of the target object photographed by the terminal comprises:
obtaining a reference image in the viewfinder of the terminal before the target object is photographed; and
performing image processing on the reference image, and determining the face orientation of the target object in the reference image according to the image processing result.
With reference to the first possible implementation of the first aspect, in a second possible implementation of the first aspect, performing image processing on the reference image and determining the face orientation of the target object in the reference image according to the image processing result comprises:
performing feature recognition on the reference image, and judging whether the reference image contains facial features;
when the reference image contains facial features, performing face recognition on the facial features in the reference image to obtain a face recognition result; and
determining the face orientation of the target object according to the face recognition result.
With reference to the second possible implementation of the first aspect, in a third possible implementation of the first aspect, determining the face orientation of the target object according to the face recognition result comprises:
extracting characteristic parameters of the facial organs in the facial features, the characteristic parameters at least comprising the perspective sizes of the facial organs;
calculating the ratios between the facial organs in the facial features according to the characteristic parameters of the facial organs; and
determining the face orientation of the target object according to the ratios between the facial organs.
With reference to the second possible implementation of the first aspect, in a fourth possible implementation of the first aspect, determining the face orientation of the target object according to the face recognition result comprises:
extracting the positional relationship between the hair and the face in the facial features; and
determining the face orientation of the target object according to the positional relationship between the hair and the face.
With reference to the first aspect, in a fifth possible implementation of the first aspect, the face orientation of the target object is the direction of the target object relative to the terminal, and the target fill-light direction is the direction of a preset fill light relative to the terminal.
With reference to the fifth possible implementation of the first aspect, in a sixth possible implementation of the first aspect, determining the target fill-light direction according to the face orientation of the target object comprises:
obtaining a preset correspondence table between face orientations and illumination directions of the preset fill light relative to the terminal;
looking up, in the correspondence table, the illumination direction of the preset fill light relative to the terminal according to the face orientation of the target object; and
determining the found illumination direction of the preset fill light relative to the terminal as the target fill-light direction.
With reference to the sixth possible implementation of the first aspect, in a seventh possible implementation of the first aspect, looking up, in the correspondence table, the illumination direction of the preset fill light relative to the terminal according to the face orientation of the target object comprises:
when the face orientation of the target object is toward the left of the terminal, determining that the illumination direction of the preset fill light is toward the upper left of the terminal;
when the face orientation of the target object is toward the right of the terminal, determining that the illumination direction of the preset fill light is toward the upper right of the terminal; and
when the face orientation of the target object is directly facing the terminal, determining that the illumination direction of the preset fill light is toward the area above the terminal.
With reference to the sixth possible implementation of the first aspect, in an eighth possible implementation of the first aspect, one face orientation corresponds to multiple illumination directions in the correspondence table;
determining the target fill-light direction according to the face orientation of the target object further comprises:
detecting the light-source point positions respectively corresponding to the multiple found illumination directions of the preset fill light relative to the terminal; and
selecting, as the illumination direction of the preset fill light, an illumination direction whose light-source point position is outside the face of the target object.
With reference to any one of the fifth to eighth possible implementations of the first aspect, in a ninth possible implementation of the first aspect, supplementing light in the target fill-light direction to obtain the target image of the target object comprises:
controlling the preset fill light to illuminate in the illumination direction so as to change the ambient light around the target object; and
while the preset fill light illuminates in the illumination direction, controlling the terminal to photograph the target object to obtain the target image of the target object.
With reference to the first aspect, in a tenth possible implementation of the first aspect, the face orientation of the target object is the direction of the target object relative to the reference image, and the target fill-light direction is the simulated direction of simulated light relative to the reference image.
With reference to the tenth possible implementation of the first aspect, in an eleventh possible implementation of the first aspect, supplementing light in the target fill-light direction to obtain the target image of the target object comprises:
generating target simulated light whose illumination direction matches the simulated direction; and
when the terminal photographs the target object, adding the target simulated light to the captured image to obtain the target image of the target object.
With reference to the eleventh possible implementation of the first aspect, in a twelfth possible implementation of the first aspect, generating the target simulated light whose illumination direction matches the simulated direction comprises:
generating multiple reference simulated lights in the simulated direction in a predetermined manner, the simulated light-source points of the multiple reference simulated lights being at different positions; and
selecting, from the multiple reference simulated lights, one reference simulated light whose simulated light-source position is not on the face of the target object as the target simulated light.
According to a second aspect of the embodiments of the present disclosure, a photographing control device applied to a terminal is provided, the device comprising:
an orientation recognition module, configured to recognize the face orientation of a target object photographed by the terminal;
a fill-light direction determination module, configured to determine a target fill-light direction according to the face orientation of the target object; and
a photographing control module, configured to supplement light in the target fill-light direction to obtain a target image of the target object.
With reference to the second aspect, in a first possible implementation of the second aspect, the orientation recognition module comprises:
a reference image acquisition submodule, configured to obtain a reference image in the viewfinder of the terminal before the target object is photographed; and
an image processing submodule, configured to perform image processing on the reference image and determine the face orientation of the target object in the reference image according to the image processing result.
With reference to the first possible implementation of the second aspect, in a second possible implementation of the second aspect, the image processing submodule comprises:
a feature recognition submodule, configured to perform feature recognition on the reference image and judge whether the reference image contains facial features;
a face recognition submodule, configured to, when the reference image contains facial features, perform face recognition on the facial features in the reference image to obtain a face recognition result; and
an orientation determination submodule, configured to determine the face orientation of the target object according to the face recognition result.
With reference to the second possible implementation of the second aspect, in a third possible implementation of the second aspect, the orientation determination submodule comprises:
a characteristic parameter extraction submodule, configured to extract characteristic parameters of the facial organs in the facial features, the characteristic parameters at least comprising the perspective sizes of the facial organs;
a calculation submodule, configured to calculate the ratios between the facial organs in the facial features according to the characteristic parameters of the facial organs; and
a first determination submodule, configured to determine the face orientation of the target object according to the ratios between the facial organs.
With reference to the second possible implementation of the second aspect, in a fourth possible implementation of the second aspect, the orientation determination submodule comprises:
a positional relationship determination submodule, configured to extract the positional relationship between the hair and the face in the facial features; and
a second determination submodule, configured to determine the face orientation of the target object according to the positional relationship between the hair and the face.
With reference to the second aspect, in a fifth possible implementation of the second aspect, the face orientation of the target object is the direction of the target object relative to the terminal, and the target fill-light direction is the direction of a preset fill light relative to the terminal;
the fill-light direction determination module comprises:
a correspondence acquisition submodule, configured to obtain a preset correspondence table between face orientations and illumination directions of the preset fill light relative to the terminal;
a look-up submodule, configured to look up, in the correspondence table, the illumination direction of the preset fill light relative to the terminal according to the face orientation of the target object; and
a fill-light direction determination submodule, configured to determine the found illumination direction of the preset fill light relative to the terminal as the target fill-light direction.
With reference to the fifth possible implementation of the second aspect, in a sixth possible implementation of the second aspect, the look-up submodule comprises:
a third determination submodule, configured to, when the face orientation of the target object is toward the left of the terminal, determine that the illumination direction of the preset fill light is toward the upper left of the terminal;
a fourth determination submodule, configured to, when the face orientation of the target object is toward the right of the terminal, determine that the illumination direction of the preset fill light is toward the upper right of the terminal; and
a fifth determination submodule, configured to, when the face orientation of the target object is directly facing the terminal, determine that the illumination direction of the preset fill light is toward the area above the terminal.
With reference to the fifth possible implementation of the second aspect, in a seventh possible implementation of the second aspect, one face orientation corresponds to multiple illumination directions in the correspondence table;
the fill-light direction determination module further comprises:
a position detection submodule, configured to detect the light-source point positions respectively corresponding to the multiple found illumination directions of the preset fill light relative to the terminal; and
a selection submodule, configured to select, as the illumination direction of the preset fill light, an illumination direction whose light-source point position is outside the face of the target object.
With reference to any one of the fourth to seventh possible implementations of the second aspect, in an eighth possible implementation of the second aspect, the photographing control module comprises:
an illumination direction control submodule, configured to control the preset fill light to illuminate in the illumination direction so as to change the ambient light around the target object; and
a first photographing submodule, configured to, while the preset fill light illuminates in the illumination direction, control the terminal to photograph the target object to obtain the target image of the target object.
With reference to the second aspect, in a ninth possible implementation of the second aspect, the face orientation of the target object is the direction of the target object relative to the reference image, and the target fill-light direction is the simulated direction of simulated light relative to the reference image;
the photographing control module comprises:
a simulated light generation submodule, configured to generate target simulated light whose illumination direction matches the simulated direction; and
a second photographing submodule, configured to, when the terminal photographs the target object, add the target simulated light to the captured image to obtain the target image of the target object.
With reference to the ninth possible implementation of the second aspect, in a tenth possible implementation of the second aspect, the simulated light generation submodule comprises:
a reference simulated light generation submodule, configured to generate multiple reference simulated lights in the simulated direction in a predetermined manner, the simulated light-source points of the multiple reference simulated lights being at different positions; and
a target simulated light selection submodule, configured to select, from the multiple reference simulated lights, one reference simulated light whose simulated light-source position is not on the face of the target object as the target simulated light.
According to a third aspect of the embodiments of the present disclosure, a terminal is provided, comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
recognize the face orientation of a target object photographed by the terminal;
determine a target fill-light direction according to the face orientation of the target object; and
supplement light in the target fill-light direction to obtain a target image of the target object.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects:
with the method provided by the embodiments of the present disclosure, when the target object is photographed, a matching target fill-light direction can be determined according to the face orientation of the photographed target object, and light is supplemented in this target fill-light direction to change the ambient light around the target object.
Compared with the related art, applying this method, light is supplemented in the target fill-light direction corresponding to the face orientation of the target object before or while the photo is taken, so that in the resulting target image the ambient light around the target object is in an ideal state. A satisfactory, high-quality photo can therefore be obtained in one shot, the process of adjusting and processing the photo after shooting is eliminated, and the efficiency of obtaining high-quality photos is improved.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 is a flowchart of a photographing control method according to an exemplary embodiment.
Fig. 2 is a detailed flowchart of step S101.
Fig. 3 is a detailed flowchart of step S1012.
Fig. 4 is a detailed flowchart of step S10123.
Fig. 5 is another detailed flowchart of step S10123.
Fig. 6 is a detailed flowchart of step S102.
Fig. 7 is another detailed flowchart of step S102.
Fig. 8 is a detailed flowchart of step S103.
Fig. 9 is a schematic diagram of a scene according to an exemplary embodiment.
Fig. 10 is a schematic diagram of another scene according to an exemplary embodiment.
Fig. 11 is a schematic diagram of yet another scene according to an exemplary embodiment.
Fig. 12 is another detailed flowchart of step S103.
Fig. 13 is a detailed flowchart of step S1033.
Fig. 14 is a schematic structural diagram of a photographing control device according to an exemplary embodiment.
Fig. 15 is a detailed structural diagram of the orientation recognition module.
Fig. 16 is a detailed structural diagram of the image processing submodule.
Fig. 17 is a detailed structural diagram of the orientation determination submodule.
Fig. 18 is another detailed structural diagram of the orientation determination submodule.
Fig. 19 is a detailed structural diagram of the fill-light direction determination module.
Fig. 20 is another detailed structural diagram of the fill-light direction determination module.
Fig. 21 is a detailed structural diagram of the photographing control module.
Fig. 22 is another detailed structural diagram of the photographing control module.
Fig. 23 is a detailed structural diagram of the simulated light generation submodule.
Fig. 24 is a schematic structural diagram of a terminal according to an exemplary embodiment.
Detailed description of the embodiments
Exemplary embodiments will now be described in detail, examples of which are illustrated in the accompanying drawings. When the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the invention; rather, they are merely examples of apparatuses and methods consistent with some aspects of the invention as recited in the appended claims.
The photographing control method provided by the embodiments of the present disclosure is applied to a terminal that is provided with at least a camera, for example the rear camera or front camera of a mobile phone. In addition, the terminal may be provided with a fill light, for example the built-in flash of a mobile phone, or the terminal may be connected to an independent flash device, for example an external speedlight mounted on a digital camera. Whatever its form, in the embodiments of the present disclosure the illumination direction and illumination angle of the fill light can be adjusted freely.
Fig. 1 is a flowchart of a photographing control method according to an exemplary embodiment. As shown in Fig. 1, the method may comprise the following steps.
In step S101, the face orientation of the target object photographed by the terminal is recognized.
In step S102, a target fill-light direction is determined according to the face orientation of the target object.
In step S103, light is supplemented in the target fill-light direction to obtain a target image of the target object.
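The three steps can be read as a simple control flow. The Python sketch below is only an illustration of that flow; the terminal object with its grab_viewfinder_frame, fill_light.aim and shoot methods, and the two helper functions, are hypothetical placeholders for the sub-steps detailed in the following embodiments, not an interface defined by this disclosure.

```python
def capture_with_fill_light(terminal):
    # Step S101: recognize the face orientation of the target object
    # from a reference image in the terminal's viewfinder
    reference_image = terminal.grab_viewfinder_frame()
    face_orientation = recognize_face_orientation(reference_image)

    # Step S102: map the face orientation to a target fill-light direction
    fill_direction = determine_fill_light_direction(face_orientation)

    # Step S103: supplement light in that direction, then photograph
    terminal.fill_light.aim(fill_direction)
    return terminal.shoot()
```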
With the method provided by the embodiments of the present disclosure, when the target object is photographed, a matching target fill-light direction can be determined according to the face orientation of the photographed target object, and light is supplemented in this target fill-light direction to change the ambient light around the target object.
Compared with the related art, applying this method, light is supplemented in the target fill-light direction corresponding to the face orientation of the target object before or while the photo is taken, so that in the resulting target image the ambient light around the target object is in an ideal state. A satisfactory, high-quality photo can therefore be obtained in one shot, the process of adjusting and processing the photo after shooting is eliminated, and the efficiency of obtaining high-quality photos is improved.
In another embodiment of the present disclosure, as shown in Fig. 2, step S101 may comprise the following steps.
In step S1011, a reference image in the viewfinder of the terminal before the target object is photographed is obtained.
At present, a common electronic terminal presents the image within its viewfinder range in real time when shooting. For example, a mobile phone directly displays the image within the current viewfinder range on its display, and a digital camera or DV camcorder likewise displays the image within the current viewfinder range on its electronic display.
In this step, the obtained image may be the live image captured by the camera before the terminal shoots; for example, a frame can be extracted from the live image passed from the camera to the display and used as the reference image of this step. Alternatively, an image obtained by the terminal after a normal shooting action may be used as the reference image.
In step S1012, image processing is performed on the reference image, and the face orientation of the target object in the reference image is determined according to the image processing result.
After the live image is obtained, the face of the target object in the image can be recognized by means such as image processing, and the face orientation of the target object can then be distinguished by further image processing on the face.
The target object may be a person, for example an ordinary person or a statue of a person, or a common animal, such as a cat, dog, or horse. Since the faces of different target objects differ, in practice face orientation models of different target objects can be established in advance; in this step, the image processing result can then be compared with the pre-established face orientation models to recognize the face orientation of the target object.
In the embodiments of the present disclosure, the face orientation of the target object may be the direction relative to the terminal, i.e., the terminal serves as the reference object, for example with the side of the terminal where the camera is located as the front, or with the shooting direction of the camera as the front and the opposite side as the rear. In this way, when the illumination direction of the fill light is subsequently controlled, the terminal can likewise serve as the reference object, avoiding confusion of directions. In other embodiments, the photographer holding the terminal, or some other reference object, may also be chosen as the reference.
In other embodiments of the present disclosure, the face orientation of the target object may also be the direction relative to the reference image, i.e., a reference coordinate system is set in advance in the plane of the reference image, so that the face orientation of the target object in the reference image can be determined accurately in this coordinate system.
In another embodiment of the present disclosure, as shown in Fig. 3, step S1012 may comprise the following steps.
In step S10121, feature recognition is performed on the reference image, and it is judged whether the reference image contains facial features.
In this step, feature recognition rules can be preset, so that the reference image can be analysed directly with these rules.
In the embodiments of the present disclosure, the feature recognition rules may be human facial feature recognition rules; of course, considering different user needs, feature recognition rules can also be added or changed, for example a facial feature recognition rule for dogs can be added when dogs are photographed.
When the reference image contains facial features, in step S10122, face recognition is performed on the facial features in the reference image to obtain a face recognition result.
Step S10121 performs recognition according to the feature recognition rules and yields a recognition result indicating that the image contains facial features. However, further face recognition is still needed to obtain the detailed parameters of each facial feature.
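As a rough sketch of this two-stage processing, and only under the assumption that an OpenCV Haar cascade serves as the preset feature recognition rule, the fragment below first tests whether the reference image contains facial features and then hands the face region to a finer recognition step; extract_facial_details is a hypothetical placeholder for that step, and a frontal-face cascade is just one possible rule.

```python
import cv2

def analyse_reference_image(reference_image):
    gray = cv2.cvtColor(reference_image, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None                      # S10121: no facial features in the reference image
    x, y, w, h = faces[0]                # S10121: facial features are present
    # S10122: hypothetical finer recognition of the detected face region
    return extract_facial_details(gray[y:y + h, x:x + w])
```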
In step S10123, the face orientation of the target object is determined according to the face recognition result.
In one embodiment, as shown in Fig. 4, step S10123 may comprise the following steps.
In step S11, characteristic parameters of the facial organs in the facial features are extracted, the characteristic parameters at least comprising the perspective sizes of the facial organs.
The facial organs are the ears, eyebrows, eyes, nose, and mouth. Taking a photographed person as an example, when the image contains the person's face, at least one or more of these organs appear on the face in the image.
In step S12, the ratios between the facial organs in the facial features are calculated according to the characteristic parameters of the facial organs.
With the plane of the terminal as the reference, an object farther from this plane appears smaller in the image, and an object closer to this plane appears larger. Thus, for a person photographed from the front, the ratios between the facial organs differ as the face orientation changes. When the photographed person's face squarely faces the camera, the two eyes have the same proportion in the image, and so do the two ears. When the face is turned sideways to the camera, the proportion of the two eyes in the image changes, and only one of the two ears may appear in the image.
In step S13, the face orientation of the target object is determined according to the ratios between the facial organs.
After the ratios between the facial organs in the facial features are calculated, the face orientation of the target object can be determined according to a preset correspondence between organ ratios and face orientations.
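A minimal sketch of this ratio test is given below. It relies only on the perspective assumption stated above (the eye nearer the camera appears larger); the labels, the tolerance value, and the sign convention are illustrative assumptions that would in practice be calibrated against the preset correspondence.

```python
def orientation_from_eye_ratio(left_eye_area, right_eye_area,
                               frontal_tolerance=0.15):
    # Steps S11-S13: compare the perspective sizes of the two eyes.
    ratio = left_eye_area / right_eye_area
    if abs(ratio - 1.0) <= frontal_tolerance:
        return "facing_terminal"      # both eyes roughly equal: frontal face
    # Assumed convention: a larger left eye (in image coordinates) is treated
    # as the face being turned toward the terminal's left.
    return "facing_left" if ratio > 1.0 else "facing_right"
```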
In another embodiment, as shown in Fig. 5, step S10123 may comprise the following steps.
In step S21, the positional relationship between the hair and the face in the facial features is extracted.
The positional relationship between the hair and the face includes, but is not limited to, the positions of parts such as the hairline, braids, and fringe relative to the face.
In step S22, the face orientation of the target object is determined according to the positional relationship between the hair and the face.
When the face orientation of the target object is determined from the position of the hair relative to the face, hair and face samples from a large number of people can be collected in advance, and a reasonably accurate hair-face correspondence model can be trained beforehand; this model can then be used in this step to determine the face orientation of the target object accurately.
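The disclosure does not specify the form of this correspondence model. Purely as an illustration, assuming the hairline, fringe, and braid positions are encoded as a numeric feature vector per sample, a simple classifier such as the following could stand in for the pre-trained model; the feature layout, the labels, and the choice of logistic regression are assumptions.

```python
from sklearn.linear_model import LogisticRegression

def train_hair_face_model(feature_rows, orientation_labels):
    # feature_rows: numeric vectors describing hairline/fringe/braid positions
    # relative to the face; orientation_labels: e.g. "left", "right", "front"
    model = LogisticRegression(max_iter=1000)
    model.fit(feature_rows, orientation_labels)
    return model

def orientation_from_hair(model, feature_row):
    # Step S22: predict the face orientation from the hair/face relationship
    return model.predict([feature_row])[0]
```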
In the embodiments of the present disclosure, the face orientation of the target object may be the direction relative to the terminal, i.e., the terminal serves as the reference object, for example with the side of the terminal where the camera is located as the front, or with the shooting direction of the camera as the front and the opposite side as the rear. Correspondingly, as shown in Fig. 6, step S102 may comprise the following steps.
In step S1021, a preset correspondence table between face orientations and illumination directions of the preset fill light relative to the terminal is obtained.
In one application, the correspondence table may be as shown in Table 1 below:
Table 1:

Face orientation                            | Illumination direction of the fill light
Toward the left relative to the terminal    | Toward the upper left relative to the terminal
Toward the right relative to the terminal   | Toward the upper right relative to the terminal
Facing the terminal                         | Directly above relative to the terminal
……                                          | ……
In step S1022, the illumination direction of the preset fill light relative to the terminal is looked up in the correspondence table according to the face orientation of the target object.
According to the table above, the illumination direction can be obtained directly in this step by a table look-up.
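Such a look-up can be sketched as a small dictionary; the direction names below are illustrative, not values prescribed by this disclosure.

```python
# Preset correspondence table between face orientations and fill-light
# illumination directions relative to the terminal (cf. Table 1).
FILL_LIGHT_TABLE = {
    "facing_left":     "upper_left_of_terminal",
    "facing_right":    "upper_right_of_terminal",
    "facing_terminal": "above_terminal",
}

def look_up_fill_direction(face_orientation):
    # Step S1022: a direct table look-up
    return FILL_LIGHT_TABLE.get(face_orientation)
```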
In one application, step S1022 may comprise the following steps.
31) When the face orientation of the target object is toward the left of the terminal, the illumination direction of the preset fill light is determined to be toward the upper left of the terminal.
32) When the face orientation of the target object is toward the right of the terminal, the illumination direction of the preset fill light is determined to be toward the upper right of the terminal.
33) When the face orientation of the target object is directly facing the terminal, the illumination direction of the preset fill light is determined to be toward the area above the terminal.
The correspondence between fill-light illumination directions and face orientations in Table 1 is mainly intended to make the supplementary light on the face of the target object even and to avoid over-exposure. In other embodiments of the present disclosure, this correspondence can be adjusted freely as needed, for example reversed entirely with respect to Table 1, or, to achieve a certain artistic effect, several different illumination directions of the fill light can be selected for the same face orientation. That is, the embodiment illustrated in Table 1 should not be construed as limiting the present application.
In the above embodiment, although the illumination direction of the fill light can be changed to make the supplementary light on the face of the target object even, over-exposure can still occur if the fill light shines directly on the face of the target object. For this reason, in the embodiments of the present disclosure, one face orientation corresponds to multiple illumination directions in the preset correspondence table between face orientations and illumination directions of the preset fill light relative to the terminal. For example, when the face orientation of the target object is toward the left of the terminal, the corresponding illumination directions of the fill light may be three directions: 45 degrees to the left and 45 degrees upward, 30 degrees to the left and 30 degrees upward, and 30 degrees to the left and 45 degrees upward. As shown in Fig. 7, step S102 may then comprise the following steps.
In step S1023, the light-source point positions respectively corresponding to the multiple found illumination directions of the preset fill light relative to the terminal are detected.
In the embodiments of the present disclosure, the light-source point position may be the position, on the face of the target object or outside it, where the axis of the light emitted by the fill light falls.
In step S1024, an illumination direction whose light-source point position is outside the face of the target object is selected as the illumination direction of the preset fill light.
Once a light-source point falls within the facial region of the target object, the face of the target object is likely to be over-exposed in the image obtained after shooting. Therefore, in this step, an illumination direction whose light-source point position is outside the face of the target object can be selected as the final illumination direction of the fill light.
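A sketch of this selection is shown below, assuming each face orientation has already been mapped to several candidate directions and that the face region is available as a rectangle; project_light_source_point is a hypothetical helper that yields the light-source point position described in step S1023 for a candidate direction.

```python
def choose_direction_outside_face(candidate_directions, face_rect):
    # face_rect: (x, y, width, height) of the detected face in the reference image
    fx, fy, fw, fh = face_rect
    for direction in candidate_directions:
        px, py = project_light_source_point(direction)   # hypothetical projection helper
        inside_face = fx <= px <= fx + fw and fy <= py <= fy + fh
        if not inside_face:
            return direction          # S1024: source point outside the face, keep it
    return candidate_directions[0]    # fallback if every candidate lands on the face
```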
In another embodiment of the present disclosure, as shown in Fig. 8, step S103 in the embodiment shown in Fig. 1 may comprise the following steps.
In step S1031, the preset fill light is controlled to illuminate in the illumination direction so as to change the ambient light around the target object.
Take Fig. 9 as an example, which shows a mobile phone 1, a target object 2, and a reference image 3 obtained by shooting. When the rear camera of the terminal is used to take the photo and the face orientation of the user relative to the terminal is toward the left, the illumination direction of the fill light can be toward the upper left of the terminal; the dashed region in Fig. 9 is the illumination area of the fill light. As shown in Fig. 10, likewise taking a photo with the rear camera of the terminal, when the face orientation of the user relative to the terminal is toward the right, the illumination direction of the fill light can be toward the upper right of the terminal. Or, as shown in Fig. 11, likewise taking a photo with the rear camera of the terminal, when the user directly faces the terminal, the illumination direction of the fill light can be directly above the terminal.
In Figs. 9 to 11, the correspondence between fill-light illumination directions and face orientations is mainly intended to make the supplementary light on the face of the target object even and to avoid over-exposure. In other embodiments of the present disclosure, this correspondence can be adjusted freely as needed, for example reversed entirely with respect to Figs. 9 to 11, or, to achieve a certain artistic effect, several different illumination directions of the fill light can be selected for the same face orientation. That is, the scenes shown in Figs. 9 to 11 should not be construed as limiting the present application.
In step S1032, while the preset fill light illuminates in the illumination direction, the terminal is controlled to photograph the target object to obtain the target image of the target object.
In Figs. 9 to 11 the preset fill light is the flash on the mobile phone. Once the illumination direction of the fill light has been determined, the preset fill light can be controlled, as shown in Figs. 9 to 11, to illuminate in the determined illumination direction, thereby changing the ambient light around the target object 2.
With the method provided by the embodiments of the present disclosure, when the target object is photographed, the fill light no longer provides supplementary illumination in a fixed direction; instead, its illumination direction can be changed according to the face orientation of the photographed target object.
Compared with the related art, applying this method, the fill light is controlled to supplement light in the illumination direction corresponding to the face orientation of the target object before or while the photo is taken, so that when the photo is subsequently taken the ambient light around the target object is in an ideal state. A satisfactory, high-quality photo can therefore be obtained in one shot, the process of adjusting and processing the photo after shooting is eliminated, and the efficiency of obtaining high-quality photos is improved.
In another embodiment of the present disclosure, the face orientation of the target object is the direction of the target object relative to the reference image, and the target fill-light direction is the simulated direction of simulated light relative to the reference image. In this case, as shown in Fig. 12, step S103 in the embodiment shown in Fig. 1 may comprise the following steps.
In step S1033, target simulated light whose illumination direction matches the simulated direction is generated.
In step S1034, when the terminal photographs the target object, the target simulated light is added to the captured image to obtain the target image of the target object.
As shown in Fig. 13, step S1033 may comprise the following steps.
In step S10331, multiple reference simulated lights are generated in the simulated direction in a predetermined manner, the simulated light-source points of the multiple reference simulated lights being at different positions.
In step S10332, one reference simulated light whose simulated light-source position is not on the face of the target object is selected from the multiple reference simulated lights as the target simulated light.
In this step, a reference simulated light whose virtual light-source position is outside the face of the target object can be selected as the target simulated light, so as to avoid over-exposure.
In the embodiments of the present disclosure, the relationship between the simulated direction of the target simulated light and the face orientation of the target object can be found in the description of the embodiment shown in Fig. 6 above, and is not described in detail again here.
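The simulated-light branch can be sketched as follows, assuming each reference simulated light is represented as a (source_point, rendered_overlay) pair; make_reference_light, point_in_rect, and blend are hypothetical placeholders rather than functions defined by this disclosure.

```python
def add_simulated_fill_light(captured_image, simulation_direction, face_rect,
                             num_candidates=5):
    # S10331: several reference simulated lights along the simulated direction,
    # each with a different virtual light-source point
    candidates = [make_reference_light(simulation_direction, i)
                  for i in range(num_candidates)]
    # S10332: keep one whose source point does not fall on the target's face,
    # to avoid over-exposing the face
    target = next((light for light in candidates
                   if not point_in_rect(light[0], face_rect)), candidates[0])
    # S1034: add the target simulated light to the captured image
    return blend(captured_image, target[1])
```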
Fig. 14 is a schematic structural diagram of a photographing control device according to an exemplary embodiment. Referring to Fig. 14, the device comprises an orientation recognition module 11, a fill-light direction determination module 12, and a photographing control module 13, wherein:
the orientation recognition module 11 is configured to recognize the face orientation of the target object photographed by the terminal;
the fill-light direction determination module 12 is configured to determine a target fill-light direction according to the face orientation of the target object; and
the photographing control module 13 is configured to supplement light in the target fill-light direction to obtain a target image of the target object.
With the solution provided by the embodiments of the present disclosure, when the target object is photographed, a matching target fill-light direction can be determined according to the face orientation of the photographed target object, and light is supplemented in this target fill-light direction to change the ambient light around the target object.
Compared with the related art, light is supplemented in the target fill-light direction corresponding to the face orientation of the target object before or while the photo is taken, so that in the resulting target image the ambient light around the target object is in an ideal state. A satisfactory, high-quality photo can therefore be obtained in one shot, the process of adjusting and processing the photo after shooting is eliminated, and the efficiency of obtaining high-quality photos is improved.
In an embodiment of the present disclosure, as shown in Fig. 15, the orientation recognition module 11 may comprise a reference image acquisition submodule 111 and an image processing submodule 112, wherein:
the reference image acquisition submodule 111 is configured to obtain a reference image in the viewfinder of the terminal before the target object is photographed; and
the image processing submodule 112 is configured to perform image processing on the reference image and determine the face orientation of the target object in the reference image according to the image processing result.
In an embodiment of the present disclosure, as shown in Fig. 16, the image processing submodule 112 may comprise a feature recognition submodule 21, a face recognition submodule 22, and an orientation determination submodule 23, wherein:
the feature recognition submodule 21 is configured to perform feature recognition on the reference image and judge whether the reference image contains facial features;
the face recognition submodule 22 is configured to, when the reference image contains facial features, perform face recognition on the facial features in the reference image to obtain a face recognition result; and
the orientation determination submodule 23 is configured to determine the face orientation of the target object according to the face recognition result.
In an embodiment of the present disclosure, as shown in Fig. 17, the orientation determination submodule 23 may comprise a characteristic parameter extraction submodule 231, a calculation submodule 232, and a first determination submodule 233.
The characteristic parameter extraction submodule 231 is configured to extract characteristic parameters of the facial organs in the facial features, the characteristic parameters at least comprising the perspective sizes of the facial organs;
the calculation submodule 232 is configured to calculate the ratios between the facial organs in the facial features according to the characteristic parameters of the facial organs; and
the first determination submodule 233 is configured to determine the face orientation of the target object according to the ratios between the facial organs.
In an embodiment of the present disclosure, as shown in Fig. 18, the orientation determination submodule 23 may comprise a positional relationship determination submodule 234 and a second determination submodule 235.
The positional relationship determination submodule 234 is configured to extract the positional relationship between the hair and the face in the facial features; and
the second determination submodule 235 is configured to determine the face orientation of the target object according to the positional relationship between the hair and the face.
In an embodiment of the present disclosure, the face orientation of the target object is the direction of the target object relative to the terminal, and the illumination direction of the preset fill light is the direction of the preset fill light relative to the terminal. As shown in Fig. 19, the fill-light direction determination module 12 may comprise a correspondence acquisition submodule 121 and a look-up submodule 122.
The correspondence acquisition submodule 121 is configured to obtain a preset correspondence table between face orientations and illumination directions of the preset fill light relative to the terminal; and
the look-up submodule 122 is configured to look up, in the correspondence table, the illumination direction of the preset fill light relative to the terminal according to the face orientation of the target object.
The look-up submodule 122 may comprise a third determination submodule, a fourth determination submodule, and a fifth determination submodule.
The third determination submodule is configured to, when the face orientation of the target object is toward the left of the terminal, determine that the illumination direction of the preset fill light is toward the upper left of the terminal;
the fourth determination submodule is configured to, when the face orientation of the target object is toward the right of the terminal, determine that the illumination direction of the preset fill light is toward the upper right of the terminal; and
the fifth determination submodule is configured to, when the face orientation of the target object is directly facing the terminal, determine that the illumination direction of the preset fill light is toward the area above the terminal.
In an embodiment of the present disclosure, one face orientation corresponds to multiple illumination directions in the correspondence table. Further, as shown in Fig. 20, the fill-light direction determination module 12 also comprises a position detection submodule 123 and a selection submodule 124.
The position detection submodule 123 is configured to detect the light-source point positions respectively corresponding to the multiple found illumination directions of the preset fill light relative to the terminal; and
the selection submodule 124 is configured to select, as the illumination direction of the preset fill light, an illumination direction whose light-source point position is outside the face of the target object.
In another embodiment of the present disclosure, as shown in Fig. 21, the photographing control module 13 may comprise an illumination direction control submodule 131 and a first photographing submodule 132.
The illumination direction control submodule 131 is configured to control the preset fill light to illuminate in the illumination direction so as to change the ambient light around the target object; and
the first photographing submodule 132 is configured to, while the preset fill light illuminates in the illumination direction, control the terminal to photograph the target object to obtain the target image of the target object.
In another embodiment of the present disclosure, the face orientation of the target object is the direction of the target object relative to the reference image, and the target fill-light direction is the simulated direction of simulated light relative to the reference image. As shown in Fig. 22, the photographing control module 13 comprises a simulated light generation submodule 133 and a second photographing submodule 134.
The simulated light generation submodule 133 is configured to generate target simulated light whose illumination direction matches the simulated direction; and
the second photographing submodule 134 is configured to, when the terminal photographs the target object, add the target simulated light to the captured image to obtain the target image of the target object.
In addition, as shown in Fig. 23, the simulated light generation submodule 133 may comprise a reference simulated light generation submodule 1331 and a target simulated light selection submodule 1332.
The reference simulated light generation submodule 1331 is configured to generate multiple reference simulated lights in the simulated direction in a predetermined manner, the simulated light-source points of the multiple reference simulated lights being at different positions; and
the target simulated light selection submodule 1332 is configured to select, from the multiple reference simulated lights, one reference simulated light whose simulated light-source position is not on the face of the target object as the target simulated light.
With regard to the device in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the method and will not be elaborated here.
Figure 24 is the structural representation of a kind of terminal 800 according to an exemplary embodiment.Such as, terminal 800 can be mobile phone, computer, digital broadcast terminal, messaging devices, game console, flat-panel devices, Medical Devices, body-building equipment, personal digital assistant etc.
With reference to Figure 24, terminal 800 can comprise following one or more assembly: processing components 802, memory 804, power supply module 806, multimedia groupware 808, audio-frequency assembly 810, the interface 812 of I/O (I/O), sensor cluster 814, and communications component 816.
The integrated operation of the usual control terminal 800 of processing components 802, such as with display, call, data communication, camera operation and record operate the operation be associated.Processing components 802 can comprise one or more processor 820 to perform instruction, to complete all or part of step of above-mentioned method.In addition, processing components 802 can comprise one or more module, and what be convenient between processing components 802 and other assemblies is mutual.Such as, processing components 802 can comprise multi-media module, mutual with what facilitate between multimedia groupware 808 and processing components 802.
The memory 804 is configured to store various types of data to support operation at the terminal 800. Examples of such data include instructions for any application program or method operated on the terminal 800, contact data, phonebook data, messages, pictures, video, and the like. The memory 804 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disc.
The power component 806 provides power for the various components of the terminal 800. The power component 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the terminal 800.
The multimedia component 808 includes a screen providing an output interface between the terminal 800 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe action. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the terminal 800 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a microphone (MIC), which is configured to receive external audio signals when the terminal 800 is in an operating mode, such as a call mode, a recording mode, or a speech recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, the audio component 810 also includes a loudspeaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 814 includes one or more sensors for providing state assessments of various aspects of the terminal 800. For example, the sensor component 814 can detect the open/closed state of the terminal 800 and the relative positioning of components (for example, the display and keypad of the terminal 800); the sensor component 814 can also detect a change in position of the terminal 800 or a component of the terminal 800, the presence or absence of user contact with the terminal 800, the orientation or acceleration/deceleration of the terminal 800, and a change in temperature of the terminal 800. The sensor component 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the terminal 800 and other devices. The terminal 800 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 also includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the terminal 800 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above method.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided, such as the memory 804 including instructions, where the instructions can be executed by the processor 820 of the terminal 800 to perform the above method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
A non-transitory computer-readable storage medium is provided, where, when instructions in the storage medium are executed by the processor of a terminal, the terminal is enabled to perform a photographing control method, the method including:
identifying the face orientation of a target object shot by the terminal;
determining a target light-supplementing direction according to the face orientation of the target object;
supplementing light in the target light-supplementing direction to obtain a target image of the target object.
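Putting the stored instructions together, the sketch below strings the three steps into one call. It is illustrative only: detect_face_orientation() is a placeholder (the disclosure derives orientation from facial-organ proportions or the hair/face relationship, which is not reproduced here), and the orientation-to-direction table mirrors the left/right/forward cases recited in the claims.

    # End-to-end sketch of the stored method: identify orientation, map it to a
    # light-supplementing direction, then shoot with light supplemented that way.

    FILL_DIRECTION_TABLE = {
        "left":    "upper left of the terminal",
        "right":   "upper right of the terminal",
        "forward": "above the terminal",
    }

    def detect_face_orientation(reference_image) -> str:
        # Placeholder: in the disclosure this comes from image processing on a
        # reference frame in the viewfinder (feature recognition, face recognition,
        # then orientation from feature proportions or the hair/face relationship).
        return "left"

    def photographing_control(reference_image, shoot_with_fill):
        orientation = detect_face_orientation(reference_image)                        # step 1
        fill_direction = FILL_DIRECTION_TABLE.get(orientation, "above the terminal")  # step 2
        return shoot_with_fill(fill_direction)                                        # step 3

    image = photographing_control(
        reference_image=None,
        shoot_with_fill=lambda d: f"image shot with light supplemented from the {d}",
    )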
Those skilled in the art will easily conceive of other embodiments of the present invention after considering the specification and practicing the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the present invention that follow its general principles and include common knowledge or conventional technical means in the art not disclosed in the present disclosure. The specification and embodiments are to be regarded as exemplary only, and the true scope and spirit of the present invention are indicated by the following claims.
It should be understood that the present invention is not limited to the precise constructions described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present invention is limited only by the appended claims.

Claims (25)

1. A photographing control method, applied to a terminal, characterized in that the method comprises:
identifying the face orientation of a target object shot by the terminal;
determining a target light-supplementing direction according to the face orientation of the target object;
supplementing light in the target light-supplementing direction to obtain a target image of the target object.
2. The method according to claim 1, characterized in that identifying the face orientation of the target object shot by the terminal comprises:
obtaining a reference image in the viewfinder of the terminal before the target object is shot;
performing image processing on the reference image, and determining the face orientation of the target object in the reference image according to the image processing result.
3. The method according to claim 2, characterized in that performing image processing on the reference image and determining the face orientation of the target object in the reference image according to the image processing result comprises:
performing feature recognition on the reference image, and judging whether the reference image includes facial features;
when the reference image includes facial features, performing face recognition on the facial features in the reference image to obtain a face recognition result;
determining the face orientation of the target object according to the face recognition result.
4. The method according to claim 3, characterized in that determining the face orientation of the target object according to the face recognition result comprises:
extracting feature parameters of the facial organs in the facial features, the feature parameters at least including the perspective sizes of the facial organs;
calculating proportions among the facial organs in the facial features according to the feature parameters of the facial organs;
determining the face orientation of the target object according to the proportions among the facial organs in the facial features.
5. The method according to claim 3, characterized in that determining the face orientation of the target object according to the face recognition result comprises:
extracting the positional relationship between the hair and the face in the facial features;
determining the face orientation of the target object according to the positional relationship between the hair and the face.
6. The method according to claim 1, characterized in that the face orientation of the target object is the direction of the target object relative to the terminal, and the target light-supplementing direction is the direction of a preset supplementary lamp relative to the terminal.
7. The method according to claim 6, characterized in that determining the target light-supplementing direction according to the face orientation of the target object comprises:
obtaining a preset mapping table between face orientations and illumination directions of the preset supplementary lamp relative to the terminal;
looking up, in the mapping table, the illumination direction of the preset supplementary lamp relative to the terminal according to the face orientation of the target object;
determining the found illumination direction of the preset supplementary lamp relative to the terminal as the target light-supplementing direction.
8. The method according to claim 7, characterized in that looking up, in the mapping table, the illumination direction of the preset supplementary lamp relative to the terminal according to the face orientation of the target object comprises:
when the face orientation of the target object is toward the left of the terminal, determining that the illumination direction of the preset supplementary lamp is toward the upper left of the terminal;
when the face orientation of the target object is toward the right of the terminal, determining that the illumination direction of the preset supplementary lamp is toward the upper right of the terminal;
when the face orientation of the target object is directly facing the terminal, determining that the illumination direction of the preset supplementary lamp is from directly above the terminal.
9. The method according to claim 7, characterized in that, in the mapping table, one face orientation corresponds to multiple illumination directions;
determining the target light-supplementing direction according to the face orientation of the target object further comprises:
detecting the light source point positions respectively corresponding to the multiple found illumination directions of the preset supplementary lamp relative to the terminal;
selecting, as the illumination direction of the preset supplementary lamp, the illumination direction whose light source point position is outside the face of the target object.
10. The method according to any one of claims 6 to 9, characterized in that supplementing light in the target light-supplementing direction to obtain the target image of the target object comprises:
controlling the preset supplementary lamp to illuminate in the illumination direction, so as to change the ambient light around the target object;
while the preset supplementary lamp is illuminating in the illumination direction, controlling the terminal to shoot the target object to obtain the target image of the target object.
11. The method according to claim 1, characterized in that the face orientation of the target object is the direction of the target object relative to the reference image, and the target light-supplementing direction is the simulation direction of simulated light relative to the reference image.
12. The method according to claim 11, characterized in that supplementing light in the target light-supplementing direction to obtain the target image of the target object comprises:
generating target simulated light whose illumination direction matches the simulation direction;
when the terminal shoots the target object, adding the target simulated light to the captured image to obtain the target image of the target object.
13. The method according to claim 12, characterized in that generating the target simulated light whose illumination direction matches the simulation direction comprises:
generating, in a predetermined manner, multiple pieces of reference simulated light in the simulation direction, the simulated light source points of the multiple pieces of reference simulated light being located at different positions;
selecting, from the multiple pieces of reference simulated light, one piece of reference simulated light whose simulated light source point is not located on the face of the target object as the target simulated light.
14. A photographing control device, applied to a terminal, characterized in that the device comprises:
an orientation identification module, configured to identify the face orientation of a target object shot by the terminal;
a light-supplementing direction determination module, configured to determine a target light-supplementing direction according to the face orientation of the target object;
a shooting control module, configured to supplement light in the target light-supplementing direction to obtain a target image of the target object.
15. The device according to claim 14, characterized in that the orientation identification module comprises:
a reference image obtaining submodule, configured to obtain a reference image in the viewfinder of the terminal before the target object is shot;
an image processing submodule, configured to perform image processing on the reference image and determine the face orientation of the target object in the reference image according to the image processing result.
16. The device according to claim 15, characterized in that the image processing submodule comprises:
a feature recognition submodule, configured to perform feature recognition on the reference image and judge whether the reference image includes facial features;
a face recognition submodule, configured to, when the reference image includes facial features, perform face recognition on the facial features in the reference image to obtain a face recognition result;
an orientation determination submodule, configured to determine the face orientation of the target object according to the face recognition result.
17. The device according to claim 16, characterized in that the orientation determination submodule comprises:
a feature parameter extraction submodule, configured to extract feature parameters of the facial organs in the facial features, the feature parameters at least including the perspective sizes of the facial organs;
a calculation submodule, configured to calculate proportions among the facial organs in the facial features according to the feature parameters of the facial organs;
a first determination submodule, configured to determine the face orientation of the target object according to the proportions among the facial organs in the facial features.
18. The device according to claim 16, characterized in that the orientation determination submodule comprises:
a positional relationship determination submodule, configured to extract the positional relationship between the hair and the face in the facial features;
a second determination submodule, configured to determine the face orientation of the target object according to the positional relationship between the hair and the face.
19. The device according to claim 14, characterized in that the face orientation of the target object is the direction of the target object relative to the terminal, and the target light-supplementing direction is the direction of a preset supplementary lamp relative to the terminal;
the light-supplementing direction determination module comprises:
a correspondence obtaining submodule, configured to obtain a preset mapping table between face orientations and illumination directions of the preset supplementary lamp relative to the terminal;
a lookup submodule, configured to look up, in the mapping table, the illumination direction of the preset supplementary lamp relative to the terminal according to the face orientation of the target object;
a light-supplementing direction determination submodule, configured to determine the found illumination direction of the preset supplementary lamp relative to the terminal as the target light-supplementing direction.
20. The device according to claim 19, characterized in that the lookup submodule comprises:
a third determination submodule, configured to, when the face orientation of the target object is toward the left of the terminal, determine that the illumination direction of the preset supplementary lamp is toward the upper left of the terminal;
a fourth determination submodule, configured to, when the face orientation of the target object is toward the right of the terminal, determine that the illumination direction of the preset supplementary lamp is toward the upper right of the terminal;
a fifth determination submodule, configured to, when the face orientation of the target object is directly facing the terminal, determine that the illumination direction of the preset supplementary lamp is from directly above the terminal.
21. The device according to claim 19, characterized in that, in the mapping table, one face orientation corresponds to multiple illumination directions;
the light-supplementing direction determination module further comprises:
a position detection submodule, configured to detect the light source point positions respectively corresponding to the multiple found illumination directions of the preset supplementary lamp relative to the terminal;
a selection submodule, configured to select, as the illumination direction of the preset supplementary lamp, the illumination direction whose light source point position is outside the face of the target object.
22. The device according to any one of claims 19 to 21, characterized in that the shooting control module comprises:
an illumination direction control submodule, configured to control the preset supplementary lamp to illuminate in the illumination direction, so as to change the ambient light around the target object;
a first shooting submodule, configured to, while the preset supplementary lamp is illuminating in the illumination direction, control the terminal to shoot the target object to obtain the target image of the target object.
23. The device according to claim 14, characterized in that the face orientation of the target object is the direction of the target object relative to the reference image, and the target light-supplementing direction is the simulation direction of simulated light relative to the reference image;
the shooting control module comprises:
a simulated light generation submodule, configured to generate target simulated light whose illumination direction matches the simulation direction;
a second shooting submodule, configured to, when the terminal shoots the target object, add the target simulated light to the captured image to obtain the target image of the target object.
24. The device according to claim 23, characterized in that the simulated light generation submodule comprises:
a reference simulated light generation submodule, configured to generate, in a predetermined manner, multiple pieces of reference simulated light in the simulation direction, the simulated light source points of the multiple pieces of reference simulated light being located at different positions;
a target simulated light selection submodule, configured to select, from the multiple pieces of reference simulated light, one piece of reference simulated light whose simulated light source point is not located on the face of the target object as the target simulated light.
25. A terminal, characterized by comprising:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
identify the face orientation of a target object shot by the terminal;
determine a target light-supplementing direction according to the face orientation of the target object;
supplement light in the target light-supplementing direction to obtain a target image of the target object.
CN201410776081.4A 2014-12-15 2014-12-15 Photographing control method and device Active CN104580886B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410776081.4A CN104580886B (en) 2014-12-15 2014-12-15 Photographing control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410776081.4A CN104580886B (en) 2014-12-15 2014-12-15 Photographing control method and device

Publications (2)

Publication Number Publication Date
CN104580886A true CN104580886A (en) 2015-04-29
CN104580886B CN104580886B (en) 2018-10-12

Family

ID=53095984

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410776081.4A Active CN104580886B (en) 2014-12-15 2014-12-15 Photographing control method and device

Country Status (1)

Country Link
CN (1) CN104580886B (en)

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1992820A (en) * 2005-12-27 2007-07-04 三星Techwin株式会社 Digital camera with face detection function for facilitating exposure compensation
US20070230933A1 (en) * 2006-03-28 2007-10-04 Fujifilm Corporation Device and method for controlling flash
CN101102408A (en) * 2006-07-07 2008-01-09 奥林巴斯映像株式会社 Camera and image processing method of camera
CN101399914A (en) * 2007-09-28 2009-04-01 富士胶片株式会社 Image capture device and image capture method
CN101561872A (en) * 2008-04-16 2009-10-21 奥林巴斯映像株式会社 Image processing apparatus
CN101588443A (en) * 2009-06-22 2009-11-25 费炜 Statistical device and detection method for television audience ratings based on human face
CN102238335A (en) * 2010-04-30 2011-11-09 奥林巴斯映像株式会社 Photographic device and image data generating method
CN201639677U (en) * 2010-05-28 2010-11-17 天津三星光电子有限公司 Digital camera provided with two rotary flash lamps
CN101916370A (en) * 2010-08-31 2010-12-15 上海交通大学 Method for processing non-feature regional images in face detection
CN102385692A (en) * 2010-08-31 2012-03-21 中国科学院深圳先进技术研究院 Human face deflection image acquiring system and method
CN102466945A (en) * 2010-11-19 2012-05-23 北京海鑫智圣技术有限公司 LED supplementary lighting and image clipping evaluation system in standard image acquisition device
CN102761705A (en) * 2011-04-25 2012-10-31 奥林巴斯映像株式会社 An image recording device, an image editing device and an image capturing device
CN102332185A (en) * 2011-08-17 2012-01-25 中国铁道科学研究院电子计算技术研究所 L-type passage used for security inspection area face recognition
CN102426646A (en) * 2011-10-24 2012-04-25 西安电子科技大学 Multi-angle human face detection device and method
CN102426757A (en) * 2011-12-02 2012-04-25 上海大学 Safety driving monitoring system based on mode identification and method thereof
US20130156276A1 (en) * 2011-12-14 2013-06-20 Hon Hai Precision Industry Co., Ltd. Electronic device with a function of searching images based on facial feature and method
CN103546668A (en) * 2012-07-09 2014-01-29 联想(北京)有限公司 Image acquisition device, electronic equipment and auxiliary shooting method
CN103543575A (en) * 2012-07-10 2014-01-29 宏碁股份有限公司 Image acquisition device and light source assisted photographing method
CN103353760A (en) * 2013-04-25 2013-10-16 上海大学 Device and method for adjusting wall-mounted display interface capable of adapting to any face directions

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107948539A (en) * 2015-04-30 2018-04-20 广东欧珀移动通信有限公司 A kind of flash lamp control method and terminal
CN107948539B (en) * 2015-04-30 2020-07-17 Oppo广东移动通信有限公司 Flash lamp control method and terminal
US11803104B2 (en) 2015-11-10 2023-10-31 Lumileds Llc Adaptive light source
US11223777B2 (en) 2015-11-10 2022-01-11 Lumileds Llc Adaptive light source
US11184552B2 (en) 2015-11-10 2021-11-23 Lumileds Llc Adaptive light source
CN108881735B (en) * 2016-03-01 2021-11-26 皇家飞利浦有限公司 Method and system for selectively illuminating a scene
CN108881735A (en) * 2016-03-01 2018-11-23 皇家飞利浦有限公司 adaptive light source
CN107295223A (en) * 2016-04-04 2017-10-24 佳能株式会社 Camera system, light-emitting device, light-emitting control method and storage medium
US10495947B2 (en) 2016-07-22 2019-12-03 Jrd Communication Inc. Smart flashlight control method and mobile terminal
WO2018014816A1 (en) * 2016-07-22 2018-01-25 捷开通讯(深圳)有限公司 Smart flash lamp control method and mobile terminal
CN106408536A (en) * 2016-09-14 2017-02-15 北京小米移动软件有限公司 Image synthesis method and device
CN108259817A (en) * 2016-12-28 2018-07-06 南宁富桂精密工业有限公司 Picture shooting system and method
CN107241535B (en) * 2017-05-26 2020-10-23 北京小米移动软件有限公司 Flash lamp adjusting device and terminal equipment
CN107241535A (en) * 2017-05-26 2017-10-10 北京小米移动软件有限公司 Flash lamp adjusting means and terminal device
CN107657252A (en) * 2017-09-19 2018-02-02 成都折衍科技有限公司 The vein fingerprint face composite identification system of brightness can be automatically controlled
CN109525773A (en) * 2017-09-20 2019-03-26 卡西欧计算机株式会社 Photographic device, image capture method and recording medium
CN108055402A (en) * 2017-12-21 2018-05-18 维沃移动通信有限公司 A kind of image pickup method and mobile terminal
CN109040612B (en) * 2018-08-29 2020-07-28 百度在线网络技术(北京)有限公司 Image processing method, device and equipment of target object and storage medium
CN109040612A (en) * 2018-08-29 2018-12-18 百度在线网络技术(北京)有限公司 Image processing method, device, equipment and the storage medium of target object
CN109314751A (en) * 2018-08-30 2019-02-05 深圳市锐明技术股份有限公司 A kind of light compensation method, light compensating apparatus and electronic equipment
CN109314751B (en) * 2018-08-30 2021-01-12 深圳市锐明技术股份有限公司 Light supplementing method, light supplementing device and electronic equipment

Also Published As

Publication number Publication date
CN104580886B (en) 2018-10-12

Similar Documents

Publication Publication Date Title
CN104580886A (en) Photographing control method and device
WO2016011747A1 (en) Skin color adjustment method and device
CN106408603B (en) Shooting method and device
CN104219448B (en) Image pickup method and device
CN104506772A (en) Method and device for regulating shooting parameters
CN104639843A (en) Method and device for processing image
CN104243818A (en) Image processing method and device and image processing equipment
CN105302315A (en) Image processing method and device
CN105405427A (en) Facility brightness adjustment method and device
CN103927165A (en) Wallpaper picture processing method and device
CN103914150A (en) Camera control method and device
CN106600530B (en) Picture synthesis method and device
CN105513104A (en) Picture taking method, device and system
CN104700353A (en) Image filter generating method and device
CN105554389A (en) Photographing method and photographing apparatus
WO2022110837A1 (en) Image processing method and device
CN106506948A (en) Flash lamp control method and device
CN104933419A (en) Method and device for obtaining iris images and iris identification equipment
CN105100634A (en) Image photographing method and image photographing device
CN104156993A (en) Method and device for switching face image in picture
CN105208284A (en) Photographing reminding method and device
CN104867112A (en) Photo processing method and apparatus
CN105516588A (en) Photographic processing method and device
CN105357449A (en) Shooting method and device, and image processing method and apparatus
CN105203456A (en) Plant species identification method and apparatus thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant