CN106250021B - Photographing control method and mobile terminal - Google Patents


Info

Publication number
CN106250021B
CN106250021B (application CN201610614806.9A)
Authority
CN
China
Prior art keywords
gesture input
preset
preview interface
image
hand
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610614806.9A
Other languages
Chinese (zh)
Other versions
CN106250021A (en)
Inventor
Wang Chao (王超)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201610614806.9A priority Critical patent/CN106250021B/en
Publication of CN106250021A publication Critical patent/CN106250021A/en
Application granted granted Critical
Publication of CN106250021B publication Critical patent/CN106250021B/en
Legal status: Active

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 — Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 — Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 — Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The invention provides a photographing control method and a mobile terminal. The control method comprises the following steps: when the mobile terminal is currently displaying a photographing image preview interface, determining a gesture input area in the image preview interface; detecting whether a preset gesture exists in the gesture input area; and if the preset gesture exists, executing the photographing operation instruction corresponding to that preset gesture. The control method solves a problem of the prior art: when a selfie stick is used and the photographing mode needs to be adjusted, the mobile terminal must be brought close to the user, which makes it inconvenient to use.

Description

Photographing control method and mobile terminal
Technical Field
The present invention relates to the field of electronic devices, and in particular, to a method for controlling photographing and a mobile terminal.
Background
With the continuous development of science and technology, mobile phone technology has advanced and the photographing performance of mobile phones keeps improving, so people increasingly enjoy taking self-portraits, and the selfie stick has become a travel essential. However, because a selfie stick is relatively long, adjusting the photographing mode while using one is troublesome: after the self-photographing position has been chosen, the camera must be brought close whenever the photographing mode needs to be switched, and the position of the selfie stick must then be readjusted before photographing again. This repeated back-and-forth adjustment makes photographing cumbersome and inconvenient.
Disclosure of Invention
The technical solution of the present invention aims to provide a photographing control method and a mobile terminal, solving the prior-art problem that the mobile terminal must be brought close whenever the photographing mode needs to be adjusted during self-photographing, which makes it inconvenient to use.
The invention provides a control method for photographing, which is applied to a mobile terminal, wherein the control method comprises the following steps:
when the mobile terminal is currently in a photographed image preview interface, determining a gesture input area in the image preview interface;
detecting whether a preset gesture exists in the gesture input area;
and if the preset gesture exists, executing a photographing operation instruction corresponding to the preset gesture.
Another aspect of the present invention also provides a mobile terminal, wherein the mobile terminal includes:
the area determining module is used for determining a gesture input area in the image preview interface when the mobile terminal is currently in the photographed image preview interface;
the action detection module is used for detecting whether a preset gesture exists in the gesture input area;
and the instruction execution module is used for executing a photographing operation instruction corresponding to the preset gesture if the preset gesture exists.
The photographing control method and the mobile terminal provided by the embodiment of the invention have the following beneficial effects:
When a user takes a photograph, for example a self-portrait, and the photographing mode or parameters of the mobile terminal need to be adjusted, the user does not need to bring the mobile terminal close: making the corresponding preset trajectory action in the gesture input area of the image preview interface is enough for the mobile terminal to execute the intended photographing operation instruction. This makes the self-photographing operation simple and convenient, and avoids the problems that the mobile terminal must be moved close many times during self-photographing, that the photographing process is cumbersome, and that photographing parameters must be adjusted repeatedly.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a schematic flow chart of a control method according to a first embodiment of the present invention;
FIG. 2 is a flow chart illustrating a control method according to a second embodiment of the present invention;
FIG. 3 is a schematic diagram of a display interface displayed by the control method according to the embodiment of the present invention;
FIG. 4 is a flowchart illustrating a control method according to a third embodiment of the present invention;
FIG. 5 is a block diagram of a mobile terminal according to a fourth embodiment of the present invention;
FIG. 6 is a second block diagram of a mobile terminal according to the fourth embodiment of the present invention;
FIG. 7 is a block diagram of a mobile terminal according to a fifth embodiment of the present invention;
FIG. 8 is a block diagram of a mobile terminal according to a sixth embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
First embodiment
Referring to fig. 1, a method for controlling photographing according to a first embodiment is applied to a mobile terminal, and the method includes:
step 101, when the mobile terminal is currently in the photographed image preview interface, determining a gesture input area in the image preview interface.
In this step, the gesture input area is a display area in which a gesture input by the user can be captured; that is, when the user takes a photograph, the gesture action is performed within the gesture input area of the current preview interface.
Specifically, the gesture input area may be a predetermined area of the image preview interface that the user determines and stores in advance, before photographing, so that the gesture input area to be used during photographing is already known. Alternatively, the gesture input area may be the area in which the user's hand is automatically detected during photographing: by recognizing the hand image information displayed in the image preview interface, the gesture input area is identified and determined.
Preferably, the photographing control method of the present invention is applied to operation control during self-photographing, and the image preview interface in step 101 is an image preview interface during a self-photographing mode.
Step 102, detecting whether a preset gesture exists in the gesture input area.
Specifically, detecting the gesture includes detecting whether the presented hand state matches a preset state and whether the motion of that hand state matches a predetermined trajectory action. The hand state used for gesture input may be, for example, one extended finger, several extended fingers, or a hand clenched into a fist, and the predetermined trajectory action may be a movement in one direction. The mobile terminal can pre-store image information for several preset hand states together with the information of several corresponding predetermined trajectory actions; different combinations of preset hand state and predetermined trajectory action form different preset gestures, and the different preset gestures correspond to different photographing operation instructions.
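The stored correspondence described above, between (hand state, trajectory action) pairs and photographing operation instructions, can be sketched as a simple lookup table. The following Python fragment is only an illustrative sketch, not the patented implementation; the hand-state labels, trajectory labels, and instruction names are all assumptions made for the example.

```python
# Illustrative preset gesture table (all names are assumptions):
# (hand_state, trajectory) -> photographing operation instruction.
PRESET_GESTURES = {
    ("one_finger", "up"):   "exposure_increase",
    ("one_finger", "down"): "exposure_decrease",
    ("fist", "up"):         "mode_next",
    ("fist", "down"):       "mode_previous",
}

def match_gesture(hand_state, trajectory):
    """Return the operation instruction for a detected gesture, or None
    when the detected (state, trajectory) pair is not a preset gesture."""
    return PRESET_GESTURES.get((hand_state, trajectory))
```

With such a table, an unrecognized combination simply yields no instruction, so stray hand movements in the preview do not trigger any photographing operation.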
And 103, if the preset gesture exists, executing a photographing operation instruction corresponding to the preset gesture.
When the preset gesture exists in the current image preview interface, it is determined that the user has input an action instruction for the current photographing, and the corresponding photographing operation instruction is executed according to the stored correspondence between preset gestures and operation instructions. For example, suppose the gesture "one extended finger moving upward" (the extended finger being the hand state, the upward movement being the predetermined trajectory action) is stored in the mobile terminal in advance and corresponds to the operation instruction that adjusts the exposure parameter of the current self-photographing in the increasing direction. When the mobile terminal detects that the user has made this upward movement with one extended finger in the image preview interface, it executes the operation instruction that increases the exposure parameter.
Through steps 101 to 103, when a user takes a self-portrait, in particular with a selfie stick, the mobile terminal is mounted at the end of the stick away from the handheld end, yet the photographing mode or parameters can be adjusted without bringing the terminal close: the user simply holds the stick with one hand and, with the other hand, makes the predetermined trajectory action corresponding to the desired operation instruction in the image preview interface, and the mobile terminal executes that instruction. The self-photographing operation is therefore simple and convenient, and the problems of having to move the mobile terminal close many times, of a cumbersome photographing process, and of repeatedly adjusting the photographing parameters are avoided.
Second embodiment
Referring to fig. 2, the method for controlling photographing according to the second embodiment includes:
step 201, when the mobile terminal is currently in the photographed image preview interface, determining a gesture input area in the image preview interface.
In this step, the gesture input area is a display area in which a gesture input by the user can be captured; that is, when the user performs a gesture input, the image information of the gesture is displayed within the gesture input area of the current preview interface. When the mobile terminal checks whether the current user is performing a gesture input for a photographing operation, it therefore only needs to examine the determined gesture input area, which improves the detection speed.
The gesture input area may be a pre-designated area in a display interface of the mobile terminal.
Preferably, in step 201, the step of determining a gesture input area in the image preview interface includes:
detecting a face image area in an image preview interface;
setting one area out of the face image area in the image preview interface as the gesture input area.
In this way, one of the regions other than the face image region in the image preview interface can be set as the gesture input region.
As shown in fig. 3, since a face image is usually present in a self-portrait, and in order to prevent the hand from overlapping the displayed head during gesture input, the gesture input area 1 used by the operation method of the present invention to control the photographing operation is set to one of the areas other than the face image area 2.
By adopting the gesture input area determining mode, the face image and the hand image displayed in the image preview interface can be obviously divided into areas, so that the action performed by the hand can be clearly detected, and the effect of increasing the recognition rate is achieved.
Specifically, after detecting the face image region in the image preview interface, the colors displayed in the image preview interface are distinguished, and a region whose color contrasts strongly with skin color is detected, such as a sky region or a clothing region (in black, red, blue or a similar color that contrasts strongly with skin color); this high-contrast region is set as the gesture input region.
In addition, to further facilitate the distinction of the hand images, it is preferable that one of the regions having a single color other than the face image region in the image preview interface be set as the gesture input region 1.
In the control method according to the present invention, the above-mentioned method for detecting a face image region in an image preview interface should be a technique well known to those skilled in the art, and this part is not a research focus of the present invention and will not be described in detail herein.
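The area-selection logic described above, excluding the face region and preferring a region whose color contrasts strongly with skin tone, can be sketched as follows. This Python fragment is an illustrative assumption, not the patented implementation: candidate regions are simplified to (x, y, w, h) boxes with a precomputed mean RGB color, and the nominal skin tone is an arbitrary reference value.

```python
# Nominal skin tone used only for contrast scoring (an assumed value).
SKIN_RGB = (224, 172, 105)

def overlaps(a, b):
    """Axis-aligned overlap test for two (x, y, w, h) boxes."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def contrast(rgb):
    """Per-channel absolute difference from the nominal skin tone."""
    return sum(abs(c - s) for c, s in zip(rgb, SKIN_RGB))

def pick_gesture_area(candidates, face_box):
    """candidates: list of (box, mean_rgb) pairs. Return the box that lies
    outside the face region and contrasts most with skin tone, or None."""
    outside = [(box, rgb) for box, rgb in candidates
               if not overlaps(box, face_box)]
    if not outside:
        return None
    return max(outside, key=lambda br: contrast(br[1]))[0]
```

In a real terminal the candidate boxes and mean colors would come from the camera frame and a face detector; here they are passed in directly so the selection rule itself stays visible.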
Step 202, displaying the boundary line of the gesture input area in the image preview interface. As shown in fig. 3, by displaying the boundary line of the identified gesture input area 1 on the image preview interface, the user can adjust the position of the hand so that the gesture input is performed inside the gesture input area 1.
And 203, outputting a prompt message for prompting the user to perform gesture input in the gesture input area.
Specifically, as shown in fig. 3, a prompt message such as "place your hand in this area" may be displayed inside the determined gesture input area 1, indicating that the displayed area is the gesture input area and prompting the user to perform the action input with a hand inside the currently displayed demarcated area in order to control the self-photographing operation.
Step 204, detecting whether a preset gesture input exists in the image preview interface in the gesture input area.
That is, it is detected whether a hand in the preset hand state is moving along the predetermined trajectory.
Through the above steps 201 to 204, after the gesture input area is determined in the current image preview interface, when it is determined whether the gesture input of the user exists, it is only required to detect whether the preset gesture input exists in the determined gesture input area.
In step 205, if a preset gesture exists, a photographing operation instruction corresponding to the preset gesture is executed.
That is, if a hand in the preset hand state moves along the predetermined trajectory, the photographing operation instruction corresponding to that predetermined trajectory action of the preset hand state is executed.
When it is determined that a predetermined trajectory action of a preset hand state exists in the gesture input area determined in the current image preview interface, it is determined that the user has input an action instruction for the current self-photographing, and the corresponding photographing operation instruction is executed according to the correspondence between the predetermined trajectory action of the preset hand state and the preset operation instruction.
With the control method of the second embodiment, when a user takes a self-portrait with a selfie stick, the mobile terminal can prompt the user to perform gesture input within an area delimited in the image preview interface, so that the hand image can be captured there. The user makes the corresponding predetermined trajectory action in the preset hand state inside the gesture input area determined in the current image preview interface, and the mobile terminal executes the photographing operation instruction to be performed, making the self-photographing operation simple and convenient.
Third embodiment
A control method according to a third embodiment, referring to fig. 4, the control method includes:
step 401, when the mobile terminal is currently in the photographed image preview interface, determining whether a hand image exists in the image preview interface.
In this step, as in the second embodiment, a gesture input area may be automatically delimited in the image preview interface, the user may be instructed to perform gesture input in that area, and it is then determined whether a hand image exists in the gesture input area.
In addition, a position area where the hand image is located may be determined as the gesture input area by automatically detecting the position of the hand image.
Step 402, detecting a presentation state of a hand displayed in a hand image if the hand image exists.
In this embodiment, when the user performs action input with the hand, the hand state may be one extended finger, two extended fingers, three extended fingers, and so on, or a hand clenched into a fist. Different hand states, combined with predetermined trajectory actions, are assigned in advance to different operation instructions used during self-photographing.
Further, this step determines whether the user is performing gesture input. For example, when the state of one extended finger is preset as a gesture input, the user is determined to be performing gesture input once a hand image showing one extended finger is detected in the image preview interface; when the clenched-fist state is preset as a gesture input, the user is determined to be performing gesture input once a hand image in the clenched-fist state is detected in the image preview interface.
Of course, the hand state predetermined for gesture input may not be limited to one type, and may be respectively used for operation instructions corresponding to different functions. When the hand image exists in the image preview interface, the detected hand image is matched with the pre-stored different hand state image information corresponding to different operation instructions, so that whether gesture input is performed in the image preview interface can be determined.
Step 403, when it is determined from the image preview interface that the hand, while in a predetermined state, moves along a predetermined trajectory within a predetermined time period, it is determined that a preset gesture exists in the image preview interface.
Through this step it is determined whether the user's current hand movement for gesture input matches a predetermined trajectory action; note that one preset hand state may be associated with several predetermined trajectory actions, which correspond to different adjustment directions of the same operation function.
When step 402 determines that the hand state of the user in the image preview interface matches a predetermined state, the initial position of the hand in the display interface is recorded, and the movement trajectory of the hand is determined from the change of its position across the frames within a predetermined time period. The determined movement trajectory information is then compared with the information of the several predetermined trajectory actions; when it matches one of them, it is determined that a preset gesture exists in the image preview interface and the corresponding operation instruction is identified.
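The trajectory determination just described, recording the hand position frame by frame and comparing the net movement with the predetermined trajectory actions, can be sketched as follows. The movement threshold and the screen coordinate convention (y grows downward) are assumptions made for the example, not values from the patent.

```python
def classify_trajectory(positions, min_pixels=40):
    """positions: list of (x, y) hand centres over the time window, oldest
    first. Returns "up", "down", "left", "right", or None when the net
    displacement is too small to count as a deliberate gesture."""
    if len(positions) < 2:
        return None
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    if max(abs(dx), abs(dy)) < min_pixels:
        return None          # movement too small: ignore jitter
    if abs(dy) >= abs(dx):   # vertical motion dominates
        return "up" if dy < 0 else "down"   # screen y grows downward
    return "left" if dx < 0 else "right"
```

Comparing only the net displacement of the window keeps the rule robust to small frame-to-frame jitter while still matching the "movement in one direction" trajectories described above.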
And step 404, determining a photographing operation function required to be executed by the operation instruction according to the hand state of the preset gesture.
The photographing operation function refers to an instruction for executing different controls in a current photographing mode. For example, the photographing operation functions include photographing mode switching, exposure parameter adjustment, brightness adjustment, photographing execution, and the like.
Step 405, determining an adjusting direction when the photographing operation function is executed according to the predetermined track motion.
In the third embodiment of the present invention, different predetermined hand states can be preset to correspond to different photographing operation functions, that is, different photographing operation functions can be corresponding to different hand states; and when the same hand state acts along different preset tracks, different preset gestures are formed and are used for executing different adjusting directions under the corresponding photographing operation function.
For example, the clenched-fist state indicates the mode-switching operation during self-photographing, where the available modes include normal photographing, beauty, panorama and night scene. When the fist moves upward, the switching modes are cycled forward in sequence; when the fist moves downward, the switching modes are cycled backward in sequence.
When the hand is in the state of one extended finger, it indicates the operation function of adjusting the exposure parameter of the current self-photographing: an upward movement with one extended finger indicates that the exposure parameter is adjusted in the increasing direction, and a downward movement with one extended finger indicates that it is adjusted in the decreasing direction.
Similarly, by this joint determination through different hand states and different predetermined trajectory actions, the photographing operation instruction to be executed for a gesture input can be determined.
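The two-part decision above, where the hand state selects the photographing function and the trajectory selects the adjustment direction, can be sketched as follows. The mode list and the exposure step size are illustrative assumptions, not values specified by the patent.

```python
# Assumed switching modes, in forward cycling order.
MODES = ["normal", "beauty", "panorama", "night"]

def execute(state, hand_state, trajectory):
    """state: dict with 'mode_index' and 'exposure'. The hand state picks
    the function; the trajectory picks the direction. Mutates and returns
    the state."""
    step = 1 if trajectory == "up" else -1 if trajectory == "down" else 0
    if hand_state == "fist":          # fist: cycle the photographing mode
        state["mode_index"] = (state["mode_index"] + step) % len(MODES)
    elif hand_state == "one_finger":  # one finger: adjust exposure
        state["exposure"] += 0.5 * step
    return state
```

The modulo makes the mode switching cyclic, so moving the fist downward from "normal" wraps around to "night", matching the "adjusted backwards in sequence" behaviour described above.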
And step 406, executing the photographing operation instruction according to the determined photographing operation function and the corresponding adjusting direction.
Through the above steps, once it is determined that the user has performed an action input for self-photographing with the hand, the photographing operation instruction to execute can be determined from the identified photographing operation function and the corresponding adjustment direction, and that operation instruction is then executed.
With the control method of the third embodiment of the invention, when a user takes a self-portrait with a selfie stick and the photographing mode or parameters of the mobile terminal need to be adjusted, the user only needs to hold the selfie stick with one hand and perform the corresponding gesture input with the other hand in the image preview interface for the mobile terminal to execute the intended operation instruction, making the self-photographing operation simple and convenient.
Preferably, in the first to third embodiments of the control method, before the gesture input area is determined in the image preview interface while the mobile terminal is in the photographing image preview interface, the control method further includes:
displaying a touch button for selecting gesture input in the image preview interface;
and if the fact that the user clicks the touch button is detected, determining a gesture input area in the image preview interface.
Then, detecting whether a preset gesture exists in the gesture input area or not,
and if the preset gesture exists, executing a photographing operation instruction corresponding to the preset gesture.
That is, before the gesture input area is determined, a touch button that lets the user enable gesture input is displayed in the image preview interface, and the operation control that performs photographing control through gesture input is started only after the user is detected to have clicked the touch button. The control method is thus started in response to the user's instruction, and the function can be closed when it is not needed, which increases flexibility of use.
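The opt-in behaviour described above, where gesture control runs only after the touch button is clicked and can be switched off again, can be sketched as a small toggle. The class and method names are illustrative assumptions.

```python
class GestureControl:
    """Illustrative toggle: gesture detection runs only while enabled."""

    def __init__(self):
        self.enabled = False  # gesture control is off until the user opts in

    def on_button_tap(self):
        """Called when the user clicks the touch button; toggles the
        feature on or off and returns the new state."""
        self.enabled = not self.enabled
        return self.enabled

    def should_detect(self):
        """The preview loop checks this before running gesture detection."""
        return self.enabled
```

Gating detection behind the flag also means the terminal spends no effort on hand recognition while the feature is closed.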
Fourth embodiment
The first to third embodiments above describe in detail the photographing control method of a mobile terminal in different scenarios; the mobile terminal corresponding to the control method of the present invention is further described below with reference to fig. 5 and fig. 6. In the fourth embodiment of the present invention, the mobile terminal 600 includes:
the area determining module 601 is configured to determine a gesture input area in an image preview interface when the mobile terminal is currently in the photographed image preview interface;
the action detection module 602 is configured to detect whether a preset gesture exists in the gesture input area;
the instruction executing module 603 is configured to, if a preset gesture exists, execute a photographing operation instruction corresponding to the preset gesture.
With the mobile terminal of this embodiment, when a user takes a self-portrait, in particular with a selfie stick, the mobile terminal is mounted at the end of the stick away from the handheld end, yet the photographing mode or parameters can be adjusted without bringing the terminal close: in the image preview interface, according to the correspondence between the operation instruction to be input and the preset gestures, the user holds the selfie stick with one hand and makes the preset gesture with the other, and the mobile terminal executes the intended operation instruction.
Preferably, as shown in fig. 6,
the region determining module 601 includes:
a face detection unit 6011 configured to detect a face image region in the image preview interface;
a setting unit 6012, configured to set, as the gesture input region, one of regions other than the face image region in the image preview interface.
Preferably, the mobile terminal further includes:
the display module 604 is configured to display a boundary line of the gesture input area in an image preview interface;
and a message prompt module 605, configured to output a prompt message prompting the user to perform gesture input in the gesture input area.
Preferably, the action detection module 602 may include:
a hand image analysis unit 6021 configured to analyze whether a hand image exists in the image preview interface;
a hand state detection unit 6022 configured to detect a presence state of a hand displayed in a hand image if the hand image exists;
the determining unit 6023 is configured to determine that a preset gesture exists in the image preview interface when it is determined that the hand moves in a predetermined state and according to a predetermined trajectory within a predetermined time period.
Preferably, the instruction execution module 603 comprises:
a first determining unit 6031, configured to determine, according to the hand state of the preset gesture, a photographing operation function that needs to be executed by the photographing operation instruction;
a second determination unit 6032 configured to determine an adjustment direction when the photographing operation function is executed according to the predetermined trajectory action;
an executing unit 6033, configured to execute the photographing operation instruction according to the determined photographing operation function and the corresponding adjusting direction.
Preferably, the mobile terminal 600 further includes:
the button display module is used for displaying a touch button for selecting gesture input in the image preview interface;
and the triggering module is used for triggering the step of determining the gesture input area in the image preview interface when it is detected that the user clicks the touch button.
With the above implementation, the mobile terminal makes the self-photographing operation process simple and convenient, and avoids the problem that, when photographing with a selfie stick, the user has to move the mobile terminal close many times, which makes the photographing process cumbersome and requires the photographing parameters to be adjusted repeatedly.
Fifth embodiment
Fig. 7 is a block diagram of a mobile terminal 700 according to another embodiment of the present invention. The mobile terminal 700 includes: at least one processor 701, a memory 702, a photographing component 703, and a user interface 704. The various components in the mobile terminal 700 are coupled together by a bus system 705. It is understood that the bus system 705 is used to enable communications among these components. In addition to a data bus, the bus system 705 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are labeled in figure 7 as the bus system 705.
The user interface 704 may include, among other things, a display or a pointing device (e.g., a touch pad or touch screen, etc.).
It is to be understood that the memory 702 in embodiments of the present invention may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced Synchronous DRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 702 of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 702 stores the following elements, executable modules or data structures, or a subset thereof, or an expanded set thereof: an operating system 7021 and application programs 7022.
The operating system 7021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, for implementing various basic services and processing hardware-based tasks. The application 7022 includes various applications, such as a Media Player (Media Player), a Browser (Browser), and the like, for implementing various application services. Programs that implement methods in accordance with embodiments of the present invention can be included within application program 7022.
In an embodiment of the present invention, the program or instructions stored by memory 702 may be, in particular, stored in application program 7022. The processor 701 is configured to determine a gesture input area in an image preview interface when the mobile terminal is currently in the photographed image preview interface; detecting whether a preset gesture exists in the gesture input area; and if the preset gesture exists, executing a photographing operation instruction corresponding to the preset gesture.
The method disclosed in the above embodiments of the present invention may be applied to the processor 701, or implemented by the processor 701. The processor 701 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be implemented by integrated logic circuits of hardware or by instructions in the form of software in the processor 701. The processor 701 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the various methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM, an EPROM, or registers. The storage medium is located in the memory 702, and the processor 701 reads the information in the memory 702 and performs the steps of the above method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the Processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Specifically, the processor 701 is further configured to: detect a face image area in the image preview interface; and set an area outside the face image area in the image preview interface as the gesture input area.
Specifically, the processor 701 is further configured to: display a boundary line of the gesture input area in the image preview interface; and output a prompt message prompting the user to perform gesture input in the gesture input area.
Specifically, the processor 701 is further configured to: analyze whether a hand image exists in the gesture input area; if the hand image exists, detect the presentation state of the hand displayed in the hand image; and, when the hand moves in the preset state and along the preset trajectory within the preset time period, determine that the preset gesture exists in the image preview interface.
Specifically, the processor 701 is further configured to: determine, according to the hand state of the preset gesture, the photographing operation function to be executed by the photographing operation instruction; determine, according to the preset trajectory action, the adjustment direction when the photographing operation function is executed; and execute the photographing operation instruction according to the determined photographing operation function and the corresponding adjustment direction.
According to the mobile terminal 700 of the embodiment of the invention, a plurality of preset gestures are stored in advance. When a user takes a self-photograph with a selfie stick and needs to adjust the photographing mode or parameters of the mobile terminal, the user does not need to bring the mobile terminal close: one hand holds the selfie stick while the other hand performs the corresponding preset trajectory action in the preset hand state within the image preview interface, and the mobile terminal executes the corresponding operation instruction. The self-photographing operation process is therefore simple and convenient.
Sixth embodiment
Fig. 8 is a schematic structural diagram of a mobile terminal according to another embodiment of the present invention. Specifically, the mobile terminal 800 in fig. 8 may be a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), or a vehicle-mounted computer.
The mobile terminal 800 in fig. 8 includes a power supply 810, a memory 820, an input unit 830, a display unit 840, a photographing component 850, a processor 860, a WIFI (Wireless Fidelity) module 870, an audio circuit 880, and an RF circuit 890, wherein the photographing component 850 includes a first camera and a second camera.
The input unit 830 may be used, among other things, to receive user-input information and to generate signal inputs related to user settings and function control of the mobile terminal 800. Specifically, in the embodiment of the present invention, the input unit 830 may include a touch panel 831. The touch panel 831, also referred to as a touch screen, can collect touch operations performed by a user on or near it (e.g., operations performed by the user on the touch panel 831 using a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch panel 831 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 860, and can also receive and execute commands sent by the processor 860. In addition, the touch panel 831 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave types.
Among them, the display unit 840 may be used to display information input by the user or information provided to the user, as well as the various menu interfaces of the mobile terminal. The display unit 840 may include a display panel 841, which may optionally be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like.
It should be noted that the touch panel 831 can overlay the display panel 841 to form a touch display screen. When the touch display screen detects a touch operation on or near it, the operation is passed to the processor 860 to determine the type of touch event, and the processor 860 then provides a corresponding visual output on the touch display screen according to the type of touch event.
The touch display screen comprises an application program interface display area and a common control display area. The arrangement of these two display areas is not limited; any arrangement that distinguishes them, such as a top-bottom or left-right arrangement, may be used. The application interface display area may be used to display the interface of an application. Each interface may contain at least one interface element, such as an icon and/or a widget desktop control of an application. The application interface display area may also be an empty interface that does not contain any content. The common control display area is used to display frequently used controls, such as setting buttons, interface numbers, scroll bars, and application icons such as phone book icons.
The processor 860 is the control center of the mobile terminal. It connects the various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the first memory 821 and calling data stored in the second memory 822, thereby monitoring the mobile terminal as a whole. Optionally, the processor 860 may include one or more processing units.
In the embodiment of the present invention, the processor 860 is configured to determine a gesture input area in the image preview interface when the mobile terminal is currently in the photographed image preview interface by calling a software program and/or module stored in the first memory 821 and/or data stored in the second memory 822; detecting whether a preset gesture exists in the gesture input area; and if the preset gesture exists, executing a photographing operation instruction corresponding to the preset gesture.
Specifically, the processor 860 is further configured to: detect a face image area in the image preview interface; and set an area outside the face image area in the image preview interface as the gesture input area.
Specifically, the processor 860 is further configured to: display a boundary line of the gesture input area in the image preview interface; and output a prompt message prompting the user to perform gesture input in the gesture input area.
Specifically, the processor 860 is further configured to: analyze whether a hand image exists in the gesture input area; if the hand image exists, detect the presentation state of the hand displayed in the hand image; and, when the hand moves in the preset state and along the preset trajectory within the preset time period, determine that the preset gesture exists in the image preview interface.
Specifically, the processor 860 is further configured to: determine, according to the hand state of the preset gesture, the photographing operation function to be executed by the photographing operation instruction; determine, according to the preset trajectory action, the adjustment direction when the photographing operation function is executed; and execute the photographing operation instruction according to the determined photographing operation function and the corresponding adjustment direction.
According to the mobile terminal 800 of the embodiment of the invention, a plurality of preset gestures are stored in advance. When a user takes a self-photograph with a selfie stick and needs to adjust the photographing mode or parameters of the mobile terminal, the user does not need to bring the mobile terminal close: one hand holds the selfie stick while the other hand performs the corresponding preset trajectory action in the preset hand state within the image preview interface, and the mobile terminal executes the corresponding operation instruction. The self-photographing operation process is therefore simple and convenient.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
While the preferred embodiments of the present invention have been described, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the following claims.

Claims (8)

1. A photographing control method, applied to a mobile terminal, characterized in that the control method comprises the following steps:
when the mobile terminal is currently in a photographed image preview interface, determining a gesture input area in the image preview interface;
detecting whether a preset gesture exists in the image information displayed in the gesture input area;
if the preset gesture exists, executing a photographing operation instruction corresponding to the preset gesture; wherein the step of determining a gesture input area in the image preview interface comprises:
detecting a face image area in an image preview interface;
setting an area outside the facial image area in the image preview interface as the gesture input area, wherein the gesture input area is a display area of the image preview interface in which gesture input of a user can be captured.
2. The control method according to claim 1, wherein after the step of determining a gesture input region in an image preview interface, the control method further comprises:
displaying a boundary line of the gesture input area in an image preview interface;
and outputting a prompt message for prompting the user to perform gesture input in the gesture input area.
3. The control method according to claim 1, wherein the step of detecting whether the preset gesture exists in the gesture input area comprises:
analyzing whether a hand image exists in the gesture input area;
if the hand image exists, detecting the presentation state of the displayed hand in the hand image;
and when the hand moves in a preset state and according to a preset track within a preset time period, determining that a preset gesture exists in the image preview interface.
4. The control method according to claim 3, wherein the step of executing the photographing operation instruction corresponding to the preset gesture comprises:
determining a photographing operation function required to be executed by the photographing operation instruction according to the hand state of the preset gesture;
determining the adjusting direction when the photographing operation function is executed according to the preset track action;
and executing the photographing operation instruction according to the determined photographing operation function and the corresponding adjusting direction.
5. A mobile terminal, characterized in that the mobile terminal comprises:
the area determining module is used for determining a gesture input area in the image preview interface when the mobile terminal is currently in the photographed image preview interface;
the action detection module is used for detecting whether a preset gesture exists in the image information displayed in the gesture input area;
the command execution module is used for executing a photographing operation command corresponding to a preset gesture if the preset gesture exists;
wherein the region determination module comprises:
a face detection unit configured to detect a face image area in the image preview interface;
the setting unit is used for setting an area outside the face image area in the image preview interface as the gesture input area, wherein the gesture input area is a display area of the image preview interface in which gesture input of a user can be captured.
6. The mobile terminal of claim 5, wherein the mobile terminal further comprises:
the display module is used for displaying the boundary line of the gesture input area in an image preview interface;
and the message prompt module is used for outputting prompt messages for prompting the user to perform gesture input in the gesture input area.
7. The mobile terminal of claim 5, wherein the action detection module comprises:
a hand image analysis unit for analyzing whether a hand image exists in the gesture input area;
a hand state detection unit for detecting a presentation state of a hand displayed in a hand image if the hand image exists;
and the determining unit is used for determining that the preset gesture exists in the image preview interface when the hand is judged to act in a preset state and according to a preset track within a preset time period.
8. The mobile terminal of claim 7, wherein the instruction execution module comprises:
the first judgment unit is used for determining a photographing operation function required to be executed by the photographing operation instruction according to the hand state of the preset gesture;
the second judgment unit is used for determining the adjusting direction when the photographing operation function is executed according to the preset track action;
and the execution unit is used for executing the photographing operation instruction according to the determined photographing operation function and the corresponding adjusting direction.
CN201610614806.9A 2016-07-29 2016-07-29 Photographing control method and mobile terminal Active CN106250021B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610614806.9A CN106250021B (en) 2016-07-29 2016-07-29 Photographing control method and mobile terminal

Publications (2)

Publication Number Publication Date
CN106250021A CN106250021A (en) 2016-12-21
CN106250021B true CN106250021B (en) 2021-01-08

Family

ID=57607201

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610614806.9A Active CN106250021B (en) 2016-07-29 2016-07-29 Photographing control method and mobile terminal

Country Status (1)

Country Link
CN (1) CN106250021B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106843699A * 2016-12-22 2017-06-13 惠州Tcl移动通信有限公司 Slide-sensing photographing method, system and mobile terminal
CN106991653A * 2017-03-07 2017-07-28 北京小米移动软件有限公司 Method and device for controlling beautification processing
CN107105093A (en) * 2017-04-18 2017-08-29 广东欧珀移动通信有限公司 Camera control method, device and terminal based on hand track
CN109240494B (en) * 2018-08-23 2023-09-12 京东方科技集团股份有限公司 Control method, computer-readable storage medium and control system for electronic display panel
CN110070478B (en) * 2018-08-24 2020-12-04 北京微播视界科技有限公司 Deformation image generation method and device
CN110058777B (en) * 2019-03-13 2022-03-29 华为技术有限公司 Method for starting shortcut function and electronic equipment
CN115484394B * 2021-06-16 2023-11-14 荣耀终端有限公司 Method for guiding the use of air gestures, and electronic device
CN113419808A (en) * 2021-07-14 2021-09-21 武汉天睿世纪数字科技有限公司 Map marking method, map marking device, display control equipment and computer readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102188819A (en) * 2010-03-11 2011-09-21 鼎亿数码科技(上海)有限公司 Device and method for controlling video game
CN102929549A (en) * 2012-10-24 2013-02-13 广东欧珀移动通信有限公司 Camera self-photographing method and mobile terminal thereof
CN103167230A (en) * 2011-12-17 2013-06-19 富泰华工业(深圳)有限公司 Electronic equipment and method controlling shooting according to gestures thereof
WO2014019478A1 (en) * 2012-07-30 2014-02-06 Tencent Technology (Shenzhen) Company Limited Method and mobile terminal device for image operation
CN104020843A (en) * 2013-03-01 2014-09-03 联想(北京)有限公司 Information processing method and electronic device


Also Published As

Publication number Publication date
CN106250021A (en) 2016-12-21

Similar Documents

Publication Publication Date Title
CN106250021B (en) Photographing control method and mobile terminal
EP3661187B1 (en) Photography method and mobile terminal
US10136069B2 (en) Apparatus and method for positioning image area using image sensor location
CN106406710B (en) Screen recording method and mobile terminal
US9626013B2 (en) Imaging apparatus
US9185286B2 (en) Combining effective images in electronic device having a plurality of cameras
WO2019001152A1 (en) Photographing method and mobile terminal
WO2018192390A1 (en) Photographing method of mobile terminal, and mobile terminal
CN107390990B (en) Image adjusting method and mobile terminal
US20190215467A1 (en) Apparatus and method for processing an image in device
US20130222663A1 (en) User interface for a digital camera
US10191554B2 (en) Display apparatus and controlling method thereof
US20140184848A1 (en) Imaging apparatus and method for controlling the same
JP6004756B2 (en) Display control apparatus and control method thereof
CN107172347B (en) Photographing method and terminal
WO2023072156A1 (en) Photographing method, photographing apparatus, electronic device, and readable storage medium
WO2023083089A1 (en) Photographing control display method and apparatus, and electronic device and medium
CN106713742B (en) Shooting method and mobile terminal
CN112887618B (en) Video shooting method and device
CN107315529B (en) Photographing method and mobile terminal
US9621809B2 (en) Display control apparatus and method for controlling the same
KR20100088248A (en) Method for controlling user interface and mobile device employing the same
CN116711318A (en) Shooting device, control method thereof and storage medium
CN110519433B (en) Camera application control method, device, equipment and storage medium
CA2807866A1 (en) User interface for a digital camera

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant