CN108965697A - A kind of filming control method, terminal and computer readable storage medium - Google Patents
- Publication number
- CN108965697A CN108965697A CN201810691947.XA CN201810691947A CN108965697A CN 108965697 A CN108965697 A CN 108965697A CN 201810691947 A CN201810691947 A CN 201810691947A CN 108965697 A CN108965697 A CN 108965697A
- Authority
- CN
- China
- Prior art keywords
- face image
- area
- image
- shooting
- brightness
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
Abstract
The invention discloses a shooting control method, a terminal, and a computer-readable storage medium. During shooting, face images are acquired from the shooting viewfinder frame. When at least two face images are collected, the scene is identified as a group photo; the method then determines whether each face image in the viewfinder frame meets a preset shooting condition, and shooting is triggered only when every face image meets the preset shooting condition. By applying this shooting method, the face image of each subject in the captured image meets the preset shooting condition, which guarantees the shooting effect for every subject, improves the overall quality of group photos, and thereby improves user satisfaction.
Description
Technical field
The present invention relates to the field of terminals, and more specifically to a shooting control method, a terminal, and a computer-readable storage medium.
Background technique
The camera is one of the most frequently used functions in current intelligent terminals with shooting capability, and the group photo is a typical shooting scene: group photos are commonly taken at friends' parties, corporate events, family outings, and similar occasions. Because a group photo involves multiple people, it is difficult to synchronize everyone's expression during shooting, and the positions at which different people stand are not necessarily reasonable, so the resulting picture quality is often poor. For example, someone may blink at the moment of shooting, so that person's eyes are closed in the resulting photo; someone standing too close to the camera appears with an oversized face in the resulting picture, while someone standing too far away appears with an undersized face; someone standing in a dark spot appears too dark in the picture, while someone standing in an overly bright spot appears too bright. All of these factors degrade the quality of the resulting group photo and leave users unsatisfied.
Summary of the invention
The technical problem to be solved by the present invention is the poor quality of group photos captured by existing terminals; to this end, the invention provides a shooting control method, a terminal, and a computer-readable storage medium.
To solve the above technical problem, the present invention provides a shooting control method, which includes:
acquiring face images from the shooting viewfinder frame;
when at least two face images are collected, determining whether each face image meets a preset shooting condition;
if so, controlling the shooting.
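Under the claim as stated, the control flow can be sketched in a few lines of Python; the detector, the condition predicates, and all names here are illustrative assumptions, since the claim does not prescribe any particular implementation:

```python
def shooting_control(detect_faces, conditions, frame):
    """detect_faces: callable returning the face images found in the
    viewfinder frame; conditions: preset shooting conditions, each a
    predicate over one face image. Names are illustrative, not from the
    patent. Returns True when shooting should be triggered."""
    faces = detect_faces(frame)
    if len(faces) < 2:  # fewer than two faces: not a group-photo scene
        return False
    # Shoot only if every face image meets every preset shooting condition.
    return all(cond(face) for face in faces for cond in conditions)
```

In a real terminal this check would run repeatedly on preview frames, returning to acquisition whenever any face fails a condition, as in steps S301-S302 of the first embodiment below.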
Optionally, determining whether each face image meets the preset shooting condition includes:
extracting the eye-region image from each face image;
judging, from the extracted eye-region images, whether any face image shows closed eyes; if such a face image exists, determining that it does not meet the preset shooting condition.
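The claim does not specify how closed eyes are detected from the eye-region image; one common heuristic, shown here as an assumed sketch, is the eye aspect ratio (EAR) computed from six landmark points around each eye, a ratio that drops sharply when the eyelids close:

```python
from math import dist

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmark points ordered around the eye contour
    (p1..p6 as in the common EAR formulation). A low ratio means the
    eyelids are nearly closed."""
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

EAR_CLOSED_THRESHOLD = 0.2  # assumed value; tuned empirically in practice

def any_eyes_closed(faces):
    """faces: list of (left_eye, right_eye) landmark tuples, one per face
    image. Returns True if any subject's eyes appear closed, i.e. the
    preset shooting condition is not met."""
    for left_eye, right_eye in faces:
        ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
        if ear < EAR_CLOSED_THRESHOLD:
            return True
    return False
```

The landmark points themselves would come from a facial-landmark detector; the patent leaves the extraction method open.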
Optionally, the face image is a human face image, and determining whether each face image meets the preset shooting condition includes:
calculating the region area of each face image;
judging whether any face image has a region area that does not match a preset region-area threshold; if such a face image exists, determining that it does not meet the preset shooting condition.
Optionally, the face image is a human face image, and determining whether each face image meets the preset shooting condition includes:
calculating the average image brightness value of each face image;
judging whether any face image has an average image brightness value that does not match a preset brightness threshold; if such a face image exists, determining that it does not meet the preset shooting condition.
Optionally, the preset region-area threshold includes at least one of a maximum region-area threshold and a minimum region-area threshold;
judging whether any face image has a region area that does not match the preset region-area threshold includes:
when the preset region-area threshold includes a maximum region-area threshold, comparing the region area of each face image against the maximum region-area threshold; when a region area is greater than the maximum region-area threshold, determining that the region area does not match the maximum region-area threshold;
when the preset region-area threshold includes a minimum region-area threshold, comparing the region area of each face image against the minimum region-area threshold; when a region area is less than the minimum region-area threshold, determining that the region area does not match the minimum region-area threshold.
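Assuming the region area is approximated by the face bounding box and that the two thresholds take illustrative values (the patent gives none), the matching rule above can be sketched as:

```python
def area_mismatches(face_boxes, min_area=None, max_area=None):
    """face_boxes: list of (x, y, w, h) bounding boxes, one per detected
    face. min_area / max_area play the role of the minimum / maximum
    region-area thresholds; either may be omitted. Returns the indices of
    face images whose region area does not match the thresholds."""
    mismatched = []
    for i, (x, y, w, h) in enumerate(face_boxes):
        area = w * h  # region area approximated by the bounding-box area
        if max_area is not None and area > max_area:
            mismatched.append(i)  # subject too close: face too large
        elif min_area is not None and area < min_area:
            mismatched.append(i)  # subject too far: face too small
    return mismatched
```

In practice sensible thresholds depend on the frame resolution; they might, for example, be set as fractions of the total viewfinder area.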
Optionally, when it is determined that a face image does not meet the shooting condition, the method further includes:
obtaining the position, within the shooting viewfinder frame, of the face image that does not meet the shooting condition;
issuing a shooting-distance adjustment prompt to the subject at the corresponding position in the viewfinder frame according to the identified position.
Optionally, the preset brightness threshold includes at least one of a maximum brightness threshold and a minimum brightness threshold;
judging whether any face image has an average image brightness value that does not match the preset brightness threshold includes:
when the preset brightness threshold includes a maximum brightness threshold, comparing the average image brightness value of each face image against the maximum brightness threshold; when an average image brightness value is greater than the maximum brightness threshold, determining that the average image brightness value does not match the maximum brightness threshold;
when the preset brightness threshold includes a minimum brightness threshold, comparing the average image brightness value of each face image against the minimum brightness threshold; when an average image brightness value is less than the minimum brightness threshold, determining that the average image brightness value does not match the minimum brightness threshold.
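A minimal sketch of the brightness rule, assuming grayscale pixel values in 0-255 and illustrative threshold defaults (the patent does not give concrete values):

```python
def brightness_mismatches(face_pixels, min_brightness=60, max_brightness=200):
    """face_pixels: list of per-face grayscale pixel sequences (0-255).
    The threshold defaults are assumed values, not from the patent.
    Returns (index, reason) pairs for faces whose average image brightness
    value does not match the preset brightness thresholds."""
    mismatched = []
    for i, pixels in enumerate(face_pixels):
        avg = sum(pixels) / len(pixels)  # average image brightness value
        if avg > max_brightness:
            mismatched.append((i, "too bright"))
        elif avg < min_brightness:
            mismatched.append((i, "too dark"))
    return mismatched
```

On a real terminal the average would typically be computed over the luma channel of the face region rather than a raw pixel list.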
Optionally, when it is determined that a face image does not meet the shooting condition, the method further includes:
obtaining the position, within the shooting viewfinder frame, of the face image that does not meet the shooting condition;
issuing a shooting-brightness adjustment prompt to the subject at the corresponding position in the viewfinder frame according to the identified position.
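The wording and structure of the prompt are not specified by the claims; a hypothetical sketch that maps a failure reason and a viewfinder position to a reminder message might look like:

```python
def adjustment_prompt(position, reason):
    """position: (x, y) center of the offending face in the viewfinder
    frame; reason: why the face image failed the preset shooting
    condition. The message wording is illustrative; the patent only
    requires that a distance or brightness adjustment prompt be issued
    for the subject at that position."""
    x, y = position
    advice = {
        "too large": "please step back from the camera",
        "too small": "please move closer to the camera",
        "too bright": "please move out of the bright light",
        "too dark": "please move to a brighter spot",
    }[reason]
    return f"Subject near ({x}, {y}): {advice}."
```

The prompt could equally be rendered as an on-screen marker over the offending face or as a voice announcement; the claims cover the prompting step, not its presentation.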
Further, the present invention also provides a terminal, which includes a processor, a memory, and a communication bus;
the communication bus implements the connection and communication between the processor and the memory;
the processor executes one or more programs stored in the memory to implement the steps of the shooting control method described above.
Further, the present invention also provides a computer-readable storage medium storing one or more programs, which can be executed by one or more processors to implement the steps of the shooting control method described above.
Beneficial effect
With the shooting control method, terminal, and computer-readable storage medium provided by the invention, face images are acquired from the shooting viewfinder frame during shooting; when at least two face images are collected, the scene is identified as a group photo, the method determines whether each face image in the viewfinder frame meets the preset shooting condition, and shooting is triggered only when every face image meets the preset shooting condition. In this way, the face image of each subject in the captured image meets the preset shooting condition, the shooting effect of every subject is guaranteed, the overall quality of the group photo is improved, and user satisfaction is increased accordingly.
Detailed description of the invention
The present invention will be further explained below with reference to the attached drawings and embodiments, in which:
Fig. 1 is a hardware structural diagram of an optional mobile terminal for implementing embodiments of the present invention;
Fig. 2 is an electrical structural diagram of an optional camera for implementing embodiments of the present invention;
Fig. 3 is a flow diagram of the shooting control method provided by the first embodiment of the invention;
Fig. 4 is a flow diagram, provided by the first embodiment, of judging whether every subject shows an eyes-open state;
Fig. 5 is a flow diagram, provided by the first embodiment, of judging whether the shooting distance of each subject is suitable;
Fig. 6 is a flow diagram, provided by the first embodiment, of judging whether the shooting-environment brightness of each subject is suitable;
Fig. 7 is a flow diagram of the shooting control method provided by the second embodiment of the invention;
Fig. 8 is a flow diagram, provided by the second embodiment, of generating a shooting-distance adjustment reminder message;
Fig. 9 is a flow diagram, provided by the second embodiment, of generating a shooting-brightness adjustment reminder message;
Fig. 10 is a structural diagram of the terminal provided by the third embodiment of the invention;
Fig. 11 is a flow diagram of the shooting control method provided by the third embodiment of the invention.
Specific embodiment
It should be appreciated that the specific embodiments described herein merely illustrate the present invention and are not intended to limit it.
In the following description, suffixes such as "module", "component" or "unit" are used only to facilitate the explanation of the invention and have no specific meaning in themselves; therefore, "module", "component" and "unit" may be used interchangeably.
Terminals can be implemented in a variety of forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, tablet computers, laptops, palmtop computers, personal digital assistants (PDA), portable media players (PMP), navigation devices, wearable devices, smart bracelets and pedometers, as well as fixed terminals such as digital TVs and desktop computers.
The following description takes a mobile terminal as an example; those skilled in the art will appreciate that, apart from elements specific to mobile use, the constructions of the embodiments can also be applied to fixed-type terminals.
Referring to Fig. 1, a hardware structural diagram of a mobile terminal for implementing embodiments of the present invention, the mobile terminal 100 may include components such as an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110 and a power supply 111. Those skilled in the art will understand that the mobile terminal structure shown in Fig. 1 does not limit the mobile terminal; a mobile terminal may include more or fewer components than illustrated, combine certain components, or arrange components differently.
The components of the mobile terminal are introduced below with reference to Fig. 1:
The radio frequency unit 101 can be used to receive and send signals during messaging or a call; specifically, it receives downlink information from the base station and passes it to the processor 110 for handling, and sends uplink data to the base station. Generally, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with the network and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution) and TDD-LTE (Time Division Duplexing-Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help users send and receive e-mail, browse web pages, access streaming video, and so on; it provides users with wireless broadband Internet access. Although Fig. 1 shows the WiFi module 102, it is understood that the module is not an essential part of the mobile terminal and can be omitted as needed without changing the essence of the invention.
When the mobile terminal 100 is in a mode such as call-signal reception, call, recording, speech recognition or broadcast reception, the audio output unit 103 can convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 can also provide audio output related to a specific function executed by the mobile terminal 100 (for example, a call-signal reception sound or a message reception sound). The audio output unit 103 may include a loudspeaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. It may include a graphics processor (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes the image data of static pictures or video obtained by an image capture apparatus (such as a camera) in video-capture or image-capture mode, and the processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the WiFi module 102. In operating modes such as call, recording or speech recognition, the microphone 1042 can receive sound (audio data) and process it into audio data; in call mode, the processed audio (voice) data can be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101. The microphone 1042 can implement various noise-elimination (or suppression) algorithms to eliminate (or suppress) noise or interference generated while sending and receiving audio signals.
The mobile terminal 100 further includes at least one sensor 105, such as an optical sensor, a motion sensor and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As a motion sensor, the accelerometer can detect the magnitude of acceleration in all directions (generally three axes), and can detect the magnitude and direction of gravity when static; it can be used for applications that identify the phone's posture (such as horizontal/vertical screen switching, related games and magnetometer pose calibration) and for vibration-identification functions (such as pedometer and tapping). The phone can also be equipped with other sensors such as a fingerprint sensor, pressure sensor, iris sensor, molecular sensor, gyroscope, barometer, hygrometer, thermometer and infrared sensor, which are not described here.
The display unit 106 is used to display information input by the user or provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. In this embodiment the terminal can have multiple display panels, for example one on each of the front and back, or two display panels set on the front, and one of the display panels can use a low-power ink screen.
The user input unit 107 can be used to receive input numeric or character information and to generate key-signal input related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, collects the user's touch operations on or near it (for example, operations performed on or near the touch panel 1071 with a finger, a stylus or any other suitable object or attachment) and drives the corresponding connection device according to a preset program. The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch orientation and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 can be realized in multiple types such as resistive, capacitive, infrared and surface-acoustic-wave. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072, which may include but are not limited to one or more of a physical keyboard, function keys (such as volume control buttons and switch keys), a trackball, a mouse and a joystick; no specific limitation is made here.
Further, the touch panel 1071 can cover the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides corresponding visual output on the display panel 1061 according to the type of the touch event. Although in Fig. 1 the touch panel 1071 and the display panel 1061 realize the input and output functions of the mobile terminal as two independent components, in certain embodiments the touch panel 1071 and the display panel 1061 can be integrated to realize the input and output functions of the mobile terminal; no specific limitation is made here.
The interface unit 108 serves as an interface through which at least one external device can connect to the mobile terminal 100. For example, the external device may include a wired or wireless headphone port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The interface unit 108 can be used to receive input (for example, data information or electric power) from an external device and transfer the received input to one or more elements in the mobile terminal 100, or to transmit data between the mobile terminal 100 and an external device.
The memory 109 can be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area: the program storage area can store the operating system, application programs required for at least one function (such as a sound-playing function or an image-playing function), and so on; the data storage area can store data created according to the use of the phone (such as audio data and a phone book). In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, for example at least one magnetic disk memory, a flash memory device, or other solid-state storage components.
The processor 110 is the control center of the mobile terminal. It connects the parts of the entire mobile terminal through various interfaces and lines, and executes the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the mobile terminal as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 can integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs and so on, and the modem processor mainly handles wireless communication. It is understood that the modem processor may also not be integrated into the processor 110.
The mobile terminal 100 can also include a power supply 111 (such as a battery) that supplies power to all parts. Preferably, the power supply 111 can be logically connected with the processor 110 through a power management system, so that functions such as charging management, discharging management and power-consumption management are realized through the power management system.
Although not shown in Fig. 1, the mobile terminal 100 can also include a Bluetooth module, a camera module, and so on.
Fig. 2 is an electrical structural diagram of an optional camera module for implementing embodiments of the present invention.
The photographic lens 1211 is composed of multiple optical lenses used to form an image of the subject, and may be a single-focus lens or a zoom lens. Under the control of the lens driver 1221, the photographic lens 1211 can move in the direction of the optical axis; the lens driver 1221 controls the focal position of the photographic lens 1211 according to the control signal from the lens driving control circuit 1222, and in the case of a zoom lens it can also control the focal length. The lens driving control circuit 1222 performs the drive control of the lens driver 1221 according to the control command from the microcomputer 1217, and can also perform drive control according to control commands from the controller 180, a processor, a microcontroller or a microprocessor.
A photographing element 1212 is arranged on the optical axis of the photographic lens 1211, near the position of the subject image formed by the photographic lens 1211. The photographing element 1212 is used to capture the subject image and obtain image data. Photodiodes constituting the pixels are arranged two-dimensionally in a matrix on the photographing element 1212. Each photodiode generates a photoelectric conversion current corresponding to the received light quantity, and the charge of this current is accumulated by the capacitor connected to each photodiode. The front surface of each pixel is provided with an RGB color filter in a Bayer arrangement.
The photographing element 1212 is connected with the imaging circuit 1213, which performs charge-accumulation control and image-signal readout control in the photographing element 1212, performs waveform shaping on the read image signal (an analog image signal) after reducing the reset noise, and then raises the gain to obtain an appropriate signal level.
The imaging circuit 1213 is connected with the A/D converter 1214, which performs analog-to-digital conversion on the analog image signal and outputs a digital image signal (hereinafter referred to as image data) to the bus 1227.
The bus 1227 is a transmission path for transmitting the various data read or generated inside the camera. The above A/D converter 1214 is connected to the bus 1227, which is further connected to the image processor 1215, the JPEG processor 1216, the microcomputer 1217, the SDRAM (Synchronous Dynamic Random Access Memory) 1218, the memory interface (hereinafter referred to as memory I/F) 1219 and the LCD (Liquid Crystal Display) driver 1220.
The image processor 1215 performs various kinds of image processing on the image data output from the photographing element 1212, such as OB subtraction, white-balance adjustment, color-matrix operation, gamma conversion, color-difference signal processing, noise-removal processing, simultaneous conversion processing and edge processing. When image data is recorded in the recording medium 1225, the JPEG processor 1216 compresses the image data read from the SDRAM 1218 according to the JPEG compression scheme. In addition, for image reproduction and display, the JPEG processor 1216 decompresses JPEG image data: the file recorded in the recording medium 1225 is read, decompressed in the JPEG processor 1216, temporarily stored in the SDRAM 1218, and displayed on the LCD 1226. In this embodiment JPEG is adopted as the image compression/decompression scheme, but the scheme is not limited thereto; other compression/decompression schemes such as MPEG, TIFF and H.264 can of course be used.
The microcomputer 1217 functions as the control unit of the camera as a whole and uniformly controls the various processing sequences of the camera. The microcomputer 1217 is connected to the operating unit 1223 and the flash memory 1224.
The operating unit 1223 includes but is not limited to physical or virtual keys, which may be operational controls such as a power button, a camera button, an edit key, a dynamic-image button, a reproduction button, a menu button, a cross key, an OK button, a delete button, an enlarge button and various other input buttons and enter keys. The operating unit 1223 detects the operation states of these controls and outputs the detection results to the microcomputer 1217. In addition, a touch panel is provided on the front surface of the LCD 1226 serving as the display; it detects the user's touch position and outputs the touch position to the microcomputer 1217. According to the detection results of the operating positions from the operating unit 1223, the microcomputer 1217 executes the various processing sequences corresponding to the user's operations.
The flash memory 1224 stores the programs used to execute the various processing sequences of the microcomputer 1217, and the microcomputer 1217 controls the camera as a whole according to these programs. In addition, the flash memory 1224 stores the various adjustment values of the camera; the microcomputer 1217 reads the adjustment values and controls the camera according to them.
The SDRAM 1218 is an electrically rewritable volatile memory used to temporarily store image data. It temporarily stores the image data output from the A/D converter 1214 and the image data processed in the image processor 1215, the JPEG processor 1216 and the like.
The memory interface 1219 is connected with the recording medium 1225 and controls the writing of image data, and of the file headers attached to image data, into the recording medium 1225 and their reading from the recording medium 1225. The recording medium 1225 is, for example, a recording medium such as a memory card that can be freely attached to and detached from the camera body, but it is not limited thereto and may also be a hard disk or the like built into the camera body.
The LCD driver 1220 is connected with the LCD 1226. The image data processed by the image processor 1215 is stored in the SDRAM 1218; when display is needed, the stored image data is read and displayed on the LCD 1226. Alternatively, the compressed image data of the JPEG processor 1216 is stored in the SDRAM 1218; when display is needed, the JPEG processor 1216 reads the compressed image data from the SDRAM 1218, decompresses it, and the decompressed image data is displayed through the LCD 1226.
The LCD 1226 is arranged on the back side of the camera body and performs image display. The LCD 1226 can use an LCD display panel, but it is not limited thereto; various other display panels, such as organic EL panels, can also be used.
Based on the above mobile-terminal hardware structure and camera electrical structure, the embodiments of the shooting control method, terminal and storage medium of the present invention are proposed. Hereinafter, the shooting control method, terminal and storage medium of the present invention are illustrated through specific embodiments.
First embodiment
The shooting control method provided in this embodiment can be implemented in group-photo scenes: shooting is controlled only when every face image in the viewfinder frame meets the preset shooting condition, so that the shooting effect of each subject in the group photo is guaranteed and the overall quality of the group-photo image is improved. The shooting control method provided in this embodiment, shown in Fig. 3, includes:
S301: Collect face images from the shooting viewfinder frame.
In this embodiment, when collecting face images from the viewfinder frame obtained during shooting, various face-image recognition methods may be used; the specific face recognition algorithm used for the collection is therefore not described in detail here. The face image in this embodiment may be a human face image, or it may be the face image of an animal, for example of a cat, dog, bird, or other animal. That is, in this embodiment, when a pet owner wants to take a group photo with a pet, the shooting control method provided in this embodiment can also be used, so that the effect of both the owner and the pet in the captured image is guaranteed.
S302: When at least two face images are collected, determine whether each face image meets the preset shooting condition; if so, go to S303; otherwise, go to S301.
In this embodiment, after face images are collected from the viewfinder frame, the number of face images obtained can be determined. When there is one face image, the current scene is not a group-photo scene; when there are two or more face images, the current scene is a group photo of two or more subjects.
In addition, in crowded places such as tourist attractions, a user may want to shoot an individual portrait, but face images of other people that are hard to avoid in the viewfinder frame may lead to a misjudgment. To avoid this, in this embodiment, when it is determined that there are two or more face images in the viewfinder frame, a confirmation option asking whether this is a group-photo shooting scene can be displayed on the interface. The user can then make a selection according to the current scene: if it is not a group-photo scene, the user can confirm that it is not; otherwise, the user can confirm that it is a group-photo shooting scene.
S303: Control the camera to shoot.
In this embodiment, when shooting is triggered, a corresponding voice prompt can be played to alert the subjects, so as to obtain a better shooting effect.
In this embodiment, when there is only one face in the viewfinder frame, a normal shooting mode can be used, where normal shooting includes shooting with any of the various non-group-photo shooting modes; of course, the shooting control method provided in this embodiment may also be applied to shooting a single subject.
With the shooting control method provided in this embodiment, when multiple subjects take a group photo, shooting is triggered only when the face images of all the subjects meet the preset shooting condition. The preset shooting condition in this embodiment can be set flexibly according to the specific application scene. Optionally, in some embodiments the preset shooting condition can be customized by the user and dynamically updated by the user; of course, it can also be set by the terminal manufacturer or application developer while still supporting user updates. For example, in this embodiment, the preset shooting condition can be set from at least one of the following factors: whether the eyes of every subject are open, whether every shooting distance is suitable, and whether the light at every subject's position is suitable. It should be understood that which factors are used can be set flexibly according to the specific application.
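As a minimal illustration (not part of the patent text), the "shoot only when every collected face passes every enabled check" gate described above could be sketched as follows; the function name, face representation, and example thresholds are all hypothetical:

```python
def should_shoot(face_images, checks):
    """Return True only when every collected face image passes every enabled check.

    face_images: list of per-face data (any representation the checks understand).
    checks: list of predicates, each taking one face image and returning True/False.
    """
    if len(face_images) < 2:      # fewer than two faces: not a group-photo scene
        return False
    return all(check(face) for face in face_images for check in checks)

# Toy faces as dicts, with two hypothetical checks (eyes open, face-size ratio).
eyes_open = lambda f: f["eyes_open"]
distance_ok = lambda f: 0.05 <= f["area_ratio"] <= 0.30

faces = [{"eyes_open": True, "area_ratio": 0.12},
         {"eyes_open": True, "area_ratio": 0.09}]
print(should_shoot(faces, [eyes_open, distance_ok]))  # True: every face passes
faces[1]["eyes_open"] = False
print(should_shoot(faces, [eyes_open, distance_ok]))  # False: one face fails
```

Any of the three factors described in the text (eyes, distance, lighting) can be enabled or disabled simply by adding or removing a predicate from `checks`.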
For ease of understanding, this embodiment is illustrated below with each of the above three example factors in turn.
When the preset shooting condition includes whether the eyes of every subject are open, determining whether each face image in the viewfinder frame meets the preset shooting condition is shown in Fig. 4 and comprises:
S401: Extract the eye-region image from each collected face image.
In this embodiment, various image recognition algorithms can be used to identify and extract the eye-region image from a face image; details are not described here.
S402: Analyze the extracted eye-region images.
The purpose of analyzing the eye-region images in this embodiment is to judge whether the eyes of a subject are closed, so as to avoid a subject appearing with closed eyes in the captured image. In this embodiment, when analyzing whether an eye-region image shows closed eyes, the eye region in the current image frame can be compared with the corresponding eye region in a preceding image frame to determine the change of state, and thus whether the eyes are closed.
In some examples, the eyeball image can also be identified through analysis, and the size occupied by the eyeball image determined; whether the eyes are closed is then judged according to the area occupied by the eyeball image.
S403: Judge, based on the analysis results, whether there is a face image in which the eyes are closed; if so, go to S404; otherwise, go to S405.
S404: Determine that the face images with closed eyes do not meet the preset shooting condition.
S405: Determine that each face image meets the preset shooting condition, or determine only that the eye check passes and combine it with the judgment results of the other factors.
During shooting, if a subject is too close to or too far from the camera, that is, the shooting distance is too short or too long, the captured image effect is unsatisfactory. For ease of understanding, this embodiment takes the face image being a human face image as an example. In this case, determining whether each face image in the viewfinder frame meets the preset shooting condition is shown in Fig. 5 and comprises:
S501: Calculate the region area of each face image.
In this embodiment, the total area occupied by each face image, namely the region area of the face image, can be calculated from the number of pixels occupied by the face image and the area occupied by each pixel.
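The area calculation in S501 can be sketched as pixel count times per-pixel area, assuming a binary face mask per face (an illustrative representation; the patent does not fix one):

```python
def face_region_area(mask, pixel_area=1.0):
    """Region area of one face image: pixel count times per-pixel area.

    mask: 2-D list where truthy entries mark pixels belonging to the face.
    pixel_area: physical or normalized area of a single pixel.
    """
    pixel_count = sum(1 for row in mask for p in row if p)
    return pixel_count * pixel_area

mask = [[0, 1, 1, 0],
        [1, 1, 1, 1],
        [0, 1, 1, 0]]
print(face_region_area(mask))  # 8.0
```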
S502: Judge whether there is a face image whose region area does not match the preset region-area threshold; if so, go to S503; otherwise, go to S504.
A face image whose region area does not match the preset region-area threshold indicates that the subject corresponding to that face is too far from or too close to the camera.
S503: Determine that those face images do not meet the preset shooting condition.
S504: Determine that each face image meets the preset shooting condition, or determine only that the shooting-distance check passes and combine it with the judgment results of the other factors.
During shooting, if the ambient light at a subject's position is too bright or too dark, the captured image effect is also unsatisfactory. For ease of understanding, this embodiment again takes the face image being a human face image as an example. In this case, determining whether each face image in the viewfinder frame meets the preset shooting condition is shown in Fig. 6 and comprises:
S601: Calculate the average image brightness value of each face image.
In this embodiment, the average brightness value of all the pixels occupied by a face image can be calculated and used as its average image brightness value.
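The average image brightness of S601 is simply the mean grayscale value of the face's pixels; a minimal sketch, with the flat-list representation assumed for illustration:

```python
def average_brightness(face_pixels):
    """Average image brightness of one face: mean grayscale value of its pixels.

    face_pixels: flat list of grayscale values (0-255) belonging to the face.
    """
    return sum(face_pixels) / len(face_pixels)

print(average_brightness([100, 120, 140, 160]))  # 130.0
```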
S602: Judge whether there is a face image whose average image brightness value does not match the preset brightness threshold; if so, go to S603; otherwise, go to S604.
S603: Determine that those face images do not meet the preset shooting condition.
S604: Determine that each face image meets the preset shooting condition, or determine only that the lighting check passes and combine it with the judgment results of the other factors.
With the shooting control method provided in this embodiment, when multiple subjects take a group photo, shooting can be performed only when the eyes of every subject are open, the shooting distance of every subject is moderate, and/or the light brightness at every subject's position is moderate, thereby guaranteeing the effect of each subject in the captured image and improving the satisfaction of the user experience.
Second embodiment
For ease of understanding, this embodiment is illustrated below with a specific camera application example. In this embodiment, the preset shooting condition includes two factors: whether every shooting distance is suitable and whether the light at every subject's position is suitable. It should be understood, however, that the condition may also include only one of these two factors, and that the factor of whether every subject's eyes are open may additionally be combined with them or used on its own. This embodiment takes photographing as an example; the shooting control process is shown in Fig. 7 and comprises:
S701: During shooting, collect face images from the viewfinder frame.
S702: When at least two face images are collected, calculate the region area of each face image.
S703: Judge whether there is a face image whose region area does not match the preset region-area threshold; if so, go to S704; otherwise, go to S705.
S704: Determine that those face images do not meet the shooting condition; optionally, generate a corresponding distance-adjustment reminder message to remind the user.
S705: Calculate the average image brightness value of each face image.
S706: Judge whether there is a face image whose average image brightness value does not match the preset brightness threshold; if so, go to S707; otherwise, go to S708.
S707: Determine that those face images do not meet the shooting condition; optionally, generate a corresponding brightness-adjustment reminder message to remind the user.
S708: Determine that each face image meets the preset shooting condition, and control the camera to shoot.
In the shooting control method shown in Fig. 7, whether the shooting distance of every subject is suitable is judged first, and then whether the light at every subject's position is suitable. It should be understood that the light at every subject's position may instead be judged first and the shooting distance second; or the shooting distance and the light of every subject may be judged at the same time; or, for each subject in turn, both the shooting distance and the light at its position may be judged before moving on to the next subject. In other words, the specific judgment order can be set flexibly.
During shooting, the effect is poor whether the subject is too far from or too close to the camera. In this embodiment, control may be applied only to the too-far case, only to the too-close case, or to both at once. Therefore, in this embodiment, the preset region-area threshold may include at least one of a maximum region-area threshold and a minimum region-area threshold.
In this case, judging in S703 whether there is a face image whose region area does not match the preset region-area threshold may include:
When the preset region-area threshold includes a maximum region-area threshold, compare the region area of each face image with the maximum region-area threshold respectively; when a region area is greater than the maximum region-area threshold, determine that the region area does not match the maximum region-area threshold, indicating that the shooting distance of that subject is too small, i.e. the subject is too close.
When the preset region-area threshold includes a minimum region-area threshold, compare the region area of each face image with the minimum region-area threshold respectively; when a region area is less than the minimum region-area threshold, determine that the region area does not match the minimum region-area threshold, indicating that the shooting distance of that subject is too large, i.e. the subject is too far away.
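The max/min matching rule above amounts to a simple range test. A sketch follows; the threshold values and the returned reason strings are illustrative, not taken from the patent:

```python
def area_mismatch(region_area, max_area=None, min_area=None):
    """Return a reason string when the face region area falls outside the
    configured thresholds, or None when it matches.

    max_area: exceeding it means the subject is too close.
    min_area: falling below it means the subject is too far away.
    Either threshold may be omitted (None) to disable that side of the check.
    """
    if max_area is not None and region_area > max_area:
        return "too close"
    if min_area is not None and region_area < min_area:
        return "too far"
    return None

print(area_mismatch(5000, max_area=40000, min_area=2000))   # None (matches)
print(area_mismatch(50000, max_area=40000, min_area=2000))  # too close
print(area_mismatch(1000, max_area=40000, min_area=2000))   # too far
```

Disabling a threshold by passing `None` mirrors the text's "at least one of" phrasing: either side of the check can be configured independently.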
In this embodiment, when the distance-adjustment reminder message is generated in S704, the reminder can be targeted, for example indicating which subject's position is too far or too close, so that the corresponding subject can adjust quickly and accurately. The process of generating the distance-adjustment reminder message in S704 is shown in Fig. 8 and comprises:
S801: Obtain the position, in the shooting viewfinder frame, of each face image that does not meet the shooting condition.
S802: Issue a shooting-distance adjustment reminder to the subject at the corresponding position of the viewfinder frame according to the identified position.
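S801-S802 can be sketched by mapping each failing face's bounding-box centre to a coarse position label and attaching the distance verdict. The position labels, message wording, and frame geometry are all illustrative assumptions:

```python
def distance_reminder(face_box, frame_width, verdict):
    """Build a targeted distance-adjustment reminder for one failing face.

    face_box: (x, y, w, h) of the face in the viewfinder frame.
    frame_width: width of the viewfinder frame in pixels.
    verdict: "too close" or "too far", as produced by the area check.
    """
    x, _, w, _ = face_box
    centre = x + w / 2
    if centre < frame_width / 3:
        where = "on the left"
    elif centre < 2 * frame_width / 3:
        where = "in the middle"
    else:
        where = "on the right"
    action = "step back" if verdict == "too close" else "move closer"
    return f"Subject {where}: {verdict}, please {action}."

print(distance_reminder((50, 80, 120, 120), 1080, "too close"))
# Subject on the left: too close, please step back.
```

The same position-to-label mapping would serve the brightness reminder of Fig. 9, with the verdict replaced by "too bright"/"too dark".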
During shooting, the effect is also poor whether the subject's light is too bright or too dark. In this embodiment, control may be applied only to the too-bright case, only to the too-dark case, or to both at once. Therefore, in this embodiment, the preset brightness threshold may include at least one of a maximum brightness threshold and a minimum brightness threshold.
In this case, judging in S706 whether there is a face image whose average image brightness value does not match the preset brightness threshold includes:
When the preset brightness threshold includes a maximum brightness threshold, compare the average image brightness value of each face image with the maximum brightness threshold respectively; when an average image brightness value is greater than the maximum brightness threshold, determine that the average image brightness value does not match the maximum brightness threshold, indicating that the brightness at that subject's position is too bright.
When the preset brightness threshold includes a minimum brightness threshold, compare the average image brightness value of each face image with the minimum brightness threshold respectively; when an average image brightness value is less than the minimum brightness threshold, determine that the average image brightness value does not match the minimum brightness threshold, indicating that the brightness at that subject's position is too dark.
In this embodiment, when the corresponding brightness-adjustment reminder message is generated in S707 to remind the user, the reminder can likewise be targeted; the reminder process is shown in Fig. 9 and comprises:
S901: Obtain the position, in the shooting viewfinder frame, of each face image that does not meet the shooting condition.
S902: Issue a shooting-brightness adjustment reminder to the subject at the corresponding position of the viewfinder frame according to the identified position.
The reminder message generated in this embodiment can be a voice message, a text message, or both voice information and text information at the same time; it can of course be another kind of message, as long as it effectively reminds the user.
In addition, it should be understood that the value of each of the above thresholds in this embodiment can be set flexibly according to the specific application scene, can support user-defined setting and/or updating, and can also support automatic learning, for example learning the user's preferred brightness or shooting distance from the user's shooting behavior.
With the shooting control method exemplified in this embodiment, shooting is performed only when the shooting distance and brightness of every subject are moderate, guaranteeing the effect of each subject in the captured image; and when the distance of a subject is unsuitable and/or the ambient brightness at its position is unsuitable, an intelligent reminder is issued so that the subject can adjust quickly and accurately, further improving the satisfaction of the user experience.
Third embodiment
This embodiment further provides a terminal. The terminal can be a mobile terminal with a shooting function or a wearable terminal device, or it can be a fixed terminal with a shooting function. As shown in Fig. 10, it comprises a processor 1001, a memory 1002, and a communication bus 1003.
The communication bus 1003 is used to realize connection and communication between the processor 1001 and the memory 1002.
The processor 1001 is used to execute one or more programs stored in the memory 1002 so as to realize the steps of the shooting control method exemplified in the above embodiments.
This embodiment further provides a computer-readable storage medium, which can be applied in various electronic terminals and stores one or more programs; the one or more programs can be executed by one or more processors so as to realize the steps of the shooting control method shown in the above embodiments.
For ease of understanding, this embodiment is illustrated below with a specific multi-person group-photo camera application example. In this embodiment, the preset shooting condition includes three factors: whether every shooting distance is suitable, whether the light at every position is suitable, and whether every subject's eyes are open. It should be understood, however, that the condition may also include only one or two of these three factors. The shooting control method in this embodiment is shown in Fig. 11 and comprises:
S1101: During shooting, collect face images from the viewfinder frame.
S1102: When at least two face images are collected, calculate the region area of each face image.
S1103: Judge whether there is a face image whose region area does not match the preset region-area threshold; if so, go to S1104; otherwise, go to S1105.
S1104: Determine that those face images do not meet the shooting condition; optionally, generate a corresponding distance-adjustment reminder message to remind the user.
S1105: Calculate the average image brightness value of each face image.
S1106: Judge whether there is a face image whose average image brightness value does not match the preset brightness threshold; if so, go to S1107; otherwise, go to S1108.
S1107: Determine that those face images do not meet the shooting condition; optionally, generate a corresponding brightness-adjustment reminder message to remind the user.
S1108: Extract the eye-region image from each collected face image.
S1109: Analyze the extracted eye-region images.
S1110: Judge, based on the analysis results, whether there is a face image in which the eyes are closed; if so, go to S1111; otherwise, go to S1112.
S1111: Determine that the face images with closed eyes do not meet the preset shooting condition.
S1112: Determine that each face image meets the preset shooting condition, and control the camera to shoot.
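The order of checks in Fig. 11 (area first, then brightness, then eyes) could be sketched as a single gate. All field names and threshold values below are illustrative examples, not values from the patent:

```python
def group_photo_ready(faces):
    """Fig. 11-style gate: every face must pass the area, brightness, and
    eyes-open checks, in that order, before shooting is triggered.

    faces: list of dicts with illustrative keys "area", "brightness",
    "eyes_open"; all thresholds below are example values.
    """
    if len(faces) < 2:                            # S1101/S1102: need >= 2 faces
        return False
    for f in faces:
        if not (2000 <= f["area"] <= 40000):      # S1103: distance check
            return False
        if not (60 <= f["brightness"] <= 200):    # S1106: lighting check
            return False
        if not f["eyes_open"]:                    # S1110: eye check
            return False
    return True                                   # S1112: shoot

faces = [{"area": 9000, "brightness": 120, "eyes_open": True},
         {"area": 8000, "brightness": 110, "eyes_open": True}]
print(group_photo_ready(faces))  # True
faces[0]["eyes_open"] = False
print(group_photo_ready(faces))  # False
```

Reordering the three `if` blocks changes only when a failure is detected, not the final decision, which is why the text says the judgment order can be set flexibly.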
In the shooting control method shown in Fig. 11, whether the shooting distance of every subject is suitable is judged first, then whether the light at every subject's position is suitable, and finally whether every subject's eyes are open. It should be understood, however, that the judgment order among these three factors can be set flexibly.
With the shooting control method exemplified in this embodiment, shooting is performed only when the shooting distance and brightness of every subject are moderate and every subject's eyes are open, guaranteeing the effect of each subject in the captured image; and when the distance of a subject is unsuitable and/or the ambient brightness at its position is unsuitable, an intelligent reminder is issued so that the subject can adjust quickly and accurately, further improving the satisfaction of the user experience.
It should be noted that, in this document, the terms "comprise", "include", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that includes the element.
The serial numbers of the above embodiments of the invention are for description only and do not represent the superiority or inferiority of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be realized by means of software plus the necessary general hardware platform, and of course also by hardware, although in many cases the former is the preferable implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, magnetic disk, or optical disc) and includes instructions for causing a terminal device (which may be a mobile phone, computer, server, air conditioner, network device, or the like) to execute the methods described in the embodiments of the present invention.
The embodiments of the present invention are described above with reference to the accompanying drawings, but the invention is not limited to the above specific embodiments, which are merely illustrative rather than restrictive. Inspired by the present invention, those skilled in the art can devise many other forms without departing from the scope protected by the purpose of the present invention and the claims, all of which fall within the protection of the present invention.
Claims (10)
1. A shooting control method, characterized in that the shooting control method comprises:
collecting face images from a shooting viewfinder frame;
when at least two face images are collected, determining whether each face image meets a preset shooting condition;
if so, controlling the camera to shoot.
2. The shooting control method according to claim 1, characterized in that determining whether each face image meets the preset shooting condition comprises:
extracting an eye-region image from each face image;
judging, according to the extracted eye-region images, whether there is a face image among the face images in which the eyes are closed; if so, determining that the face images with closed eyes do not meet the preset shooting condition.
3. The shooting control method according to claim 1 or 2, characterized in that the face image is a human face image, and determining whether each face image meets the preset shooting condition comprises:
calculating the region area of each face image;
judging whether there is a face image whose region area does not match a preset region-area threshold; if so, determining that those face images do not meet the preset shooting condition.
4. The shooting control method according to claim 1 or 2, characterized in that the face image is a human face image, and determining whether each face image meets the preset shooting condition comprises:
calculating the average image brightness value of each face image;
judging whether there is a face image whose average image brightness value does not match a preset brightness threshold; if so, determining that those face images do not meet the preset shooting condition.
5. The shooting control method according to claim 3, characterized in that the preset region-area threshold comprises at least one of a maximum region-area threshold and a minimum region-area threshold;
judging whether there is a face image whose region area does not match the preset region-area threshold comprises:
when the preset region-area threshold comprises the maximum region-area threshold, comparing the region area of each face image with the maximum region-area threshold respectively, and when a region area is greater than the maximum region-area threshold, determining that the region area does not match the maximum region-area threshold;
when the preset region-area threshold comprises the minimum region-area threshold, comparing the region area of each face image with the minimum region-area threshold respectively, and when a region area is less than the minimum region-area threshold, determining that the region area does not match the minimum region-area threshold.
6. The shooting control method according to claim 3, characterized in that when it is judged that there is a face image that does not meet the shooting condition, the method further comprises:
obtaining the position, in the shooting viewfinder frame, of the face image that does not meet the shooting condition;
issuing a shooting-distance adjustment reminder to the subject at the corresponding position of the shooting viewfinder frame according to the identified position.
7. The shooting control method according to claim 4, characterized in that the preset brightness threshold comprises at least one of a maximum brightness threshold and a minimum brightness threshold;
judging whether there is a face image whose average image brightness value does not match the preset brightness threshold comprises:
when the preset brightness threshold comprises the maximum brightness threshold, comparing the average image brightness value of each face image with the maximum brightness threshold respectively, and when an average image brightness value is greater than the maximum brightness threshold, determining that the average image brightness value does not match the maximum brightness threshold;
when the preset brightness threshold comprises the minimum brightness threshold, comparing the average image brightness value of each face image with the minimum brightness threshold respectively, and when an average image brightness value is less than the minimum brightness threshold, determining that the average image brightness value does not match the minimum brightness threshold.
8. The shooting control method according to claim 4, characterized in that when it is judged that there is a face image that does not meet the shooting condition, the method further comprises:
obtaining the position, in the shooting viewfinder frame, of the face image that does not meet the shooting condition;
issuing a shooting-brightness adjustment reminder to the subject at the corresponding position of the shooting viewfinder frame according to the identified position.
9. A terminal, characterized in that the terminal comprises a processor, a memory, and a communication bus;
the communication bus is used to realize connection and communication between the processor and the memory;
the processor is used to execute one or more programs stored in the memory so as to realize the steps of the shooting control method according to any one of claims 1-8.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores one or more programs, and the one or more programs can be executed by one or more processors so as to realize the steps of the shooting control method according to any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810691947.XA CN108965697A (en) | 2018-06-28 | 2018-06-28 | A kind of filming control method, terminal and computer readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810691947.XA CN108965697A (en) | 2018-06-28 | 2018-06-28 | A kind of filming control method, terminal and computer readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108965697A true CN108965697A (en) | 2018-12-07 |
Family
ID=64488076
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810691947.XA Pending CN108965697A (en) | 2018-06-28 | 2018-06-28 | A kind of filming control method, terminal and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108965697A (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8964066B2 (en) * | 2009-07-15 | 2015-02-24 | Samsung Electronics Co., Ltd | Apparatus and method for generating image including multiple people |
CN105430281A (en) * | 2015-12-24 | 2016-03-23 | 广东欧珀移动通信有限公司 | Picture taking control method, picture taking control device, selfie stick and camera system |
CN105791685A (en) * | 2016-02-29 | 2016-07-20 | 广东欧珀移动通信有限公司 | Control method, control device and electronic device |
CN105827969A (en) * | 2016-03-29 | 2016-08-03 | 乐视控股(北京)有限公司 | Intelligent camera method and device, and mobile device |
CN105959563A (en) * | 2016-06-14 | 2016-09-21 | 北京小米移动软件有限公司 | Image storing method and image storing apparatus |
CN106375665A (en) * | 2016-09-23 | 2017-02-01 | 深圳市金立通信设备有限公司 | Photographing method and terminal |
CN107124543A (en) * | 2017-02-20 | 2017-09-01 | 维沃移动通信有限公司 | A kind of image pickup method and mobile terminal |
CN108174095A (en) * | 2017-12-28 | 2018-06-15 | 努比亚技术有限公司 | Photographic method, mobile terminal and computer-readable medium based on smiling face's identification |
2018-06-28: Application CN201810691947.XA filed in China; published as CN108965697A (status: Pending)
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112153275A (en) * | 2019-06-28 | 2020-12-29 | 青岛海信移动通信技术股份有限公司 | Photographing terminal and image selection method thereof |
CN112153275B (en) * | 2019-06-28 | 2022-08-05 | 青岛海信移动通信技术股份有限公司 | Photographing terminal and image selection method thereof |
CN110365905A (en) * | 2019-07-25 | 2019-10-22 | 北京迈格威科技有限公司 | Automatic photographing method and device |
CN110365905B (en) * | 2019-07-25 | 2021-08-31 | 北京迈格威科技有限公司 | Automatic photographing method and device |
CN114303366A (en) * | 2019-09-06 | 2022-04-08 | 索尼集团公司 | Information processing apparatus, information processing method, and information processing program |
CN110933293A (en) * | 2019-10-31 | 2020-03-27 | 努比亚技术有限公司 | Shooting method, terminal and computer readable storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110401766A (en) | Image pickup method and terminal | |
CN108540724A (en) | Image pickup method and mobile terminal | |
CN105959554B (en) | Video capture device and method | |
CN108965697A (en) | Filming control method, terminal and computer-readable storage medium | |
CN108234875A (en) | Shooting display method, device, mobile terminal and storage medium | |
CN109361869A (en) | Image pickup method and terminal | |
CN109831636A (en) | Interactive video control method, terminal and computer-readable storage medium | |
CN109151180A (en) | Object identification method and mobile terminal | |
CN110225241A (en) | Video capture control method, terminal and computer-readable storage medium | |
CN108184070A (en) | Image pickup method and terminal | |
CN108600647A (en) | Shooting preview method, mobile terminal and storage medium | |
CN107948498B (en) | Method for eliminating camera moiré fringes, and mobile terminal | |
CN109005262A (en) | Camera control method and terminal | |
CN110471606A (en) | Input method and electronic equipment | |
CN108989672A (en) | Image pickup method and mobile terminal | |
CN108833709A (en) | Camera starting method and mobile terminal | |
CN108174103A (en) | Shooting reminder method and mobile terminal | |
CN109144361A (en) | Image processing method and terminal device | |
CN108965710A (en) | Photographing method, device and computer-readable storage medium | |
CN108063859A (en) | Automatic camera control method, terminal and computer storage medium | |
CN109639996A (en) | High-dynamic-scene imaging method, mobile terminal and computer-readable storage medium | |
CN110032887A (en) | Picture privacy protection method, terminal and computer-readable storage medium | |
CN109729246A (en) | Multi-camera circuit structure, terminal and computer-readable storage medium | |
CN108174081B (en) | Image pickup method and mobile terminal | |
CN108881719A (en) | Method for switching shooting mode, and terminal device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2018-12-07