CN107105093A - Camera control method, device and terminal based on hand track - Google Patents

Camera control method, device and terminal based on hand track

Info

Publication number
CN107105093A
Authority
CN
China
Prior art keywords
hand
gesture
movement locus
track
operational order
Prior art date
Legal status
Pending
Application number
CN201710253799.9A
Other languages
Chinese (zh)
Inventor
梁昆
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201710253799.9A
Publication of CN107105093A

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Environmental & Geological Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present invention provide a camera control method, device and terminal based on hand track. The method comprises the following steps: when a trigger signal is detected, continuously collecting images of a preset area; when the hand in the images is detected to be a preset gesture, acquiring the movement trajectory of the hand; generating a corresponding operation instruction according to the trajectory; and performing a corresponding operation on the camera module according to the operation instruction. By combining the preset gesture with the movement trajectory of the gesture, the present invention performs trajectory recognition only after the preset gesture has been detected, which reduces the possibility of misoperation.

Description

Camera control method, device and terminal based on hand track
Technical field
The present invention relates to the field of communications, and more particularly to a camera control method, device and terminal based on hand track.
Background Art
With the development of mobile phone photography technology, people's demands on mobile phone photography are increasing. Nowadays, there are many quick triggering modes for taking pictures, such as gesture-based shooting, shooting based on voice commands, and shooting based on facial expressions.
However, for the various operations of the camera module in a mobile phone, such as switching the exposure mode, setting the beauty (face beautification) function, switching the HDR function, and adjusting the focus, it is difficult to handle so many types of photographing operations with simple gesture recognition or facial expression recognition alone, and using motion recognition alone is also prone to misoperation.
Summary of the Invention
The embodiments of the present invention provide a camera control method, device and terminal based on hand track, so as to reduce the possibility of misoperation.
An embodiment of the present invention provides a camera control method based on hand track, comprising:
when a trigger signal is detected, continuously collecting images of a preset area;
when the hand in the images is detected to be a preset gesture, acquiring a movement trajectory of the hand;
generating a corresponding operation instruction according to the movement trajectory;
performing a corresponding operation on a camera module according to the operation instruction.
An embodiment of the present invention further provides a camera control device based on hand track, comprising:
a collection module, configured to continuously collect images of a preset area when a trigger signal is detected;
an acquisition module, configured to acquire a movement trajectory of the hand when the hand in the images is detected to be a preset gesture;
a generation module, configured to generate a corresponding operation instruction according to the movement trajectory;
an operation module, configured to perform a corresponding operation on a camera module according to the operation instruction.
An embodiment of the present invention further provides a terminal, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements any of the methods described above when executing the computer program.
It can be seen from the above that, in the camera control method based on hand track provided by the embodiments of the present invention, when a trigger signal is detected, images of a preset area are continuously collected; when the hand in the images is detected to be a preset gesture, the movement trajectory of the hand is acquired; a corresponding operation instruction is generated according to the trajectory; and a corresponding operation is performed on the camera module according to the operation instruction. Contactless control of photographing is thereby achieved, and because the preset gesture is combined with the movement trajectory of the gesture, trajectory recognition is performed only after the preset gesture has been detected, which reduces the possibility of misoperation.
Brief description of the drawings
Fig. 1 is a schematic diagram of an application scenario of the camera control method and device based on hand track in an embodiment of the present invention.
Fig. 2 is a first flow chart of the camera control method based on hand track in an embodiment of the present invention.
Fig. 3 is a second flow chart of the camera control method based on hand track in an embodiment of the present invention.
Fig. 4 is a third flow chart of the camera control method based on hand track in an embodiment of the present invention.
Fig. 5 is a fourth flow chart of the camera control method based on hand track in an embodiment of the present invention.
Fig. 6 is a first structural diagram of the camera control device based on hand track in an embodiment of the present invention.
Fig. 7 is a second structural diagram of the camera control device based on hand track in an embodiment of the present invention.
Fig. 8 is a third structural diagram of the camera control device based on hand track in an embodiment of the present invention.
Fig. 9 is a structural diagram of a terminal in an embodiment of the present invention.
Detailed Description of the Embodiments
Reference is made to the drawings, in which identical reference numerals represent identical components. The principles of the present invention are illustrated as being implemented in a suitable computing environment. The following description is based on the illustrated specific embodiments of the present invention and should not be regarded as limiting other specific embodiments of the present invention that are not detailed herein.
In the following description, specific embodiments of the present invention are described with reference to steps and symbols of operations performed by one or more computers, unless otherwise stated. It will therefore be understood that these steps and operations, which are at times referred to as being computer-executed, include the manipulation by the processing unit of the computer of electrical signals representing data in a structured form. This manipulation transforms the data or maintains them at locations in the memory system of the computer, which reconfigures or otherwise alters the operation of the computer in a manner well understood by those skilled in the art. The data structures in which the data are maintained are physical locations of the memory that have particular properties defined by the format of the data. However, while the principles of the present invention are described in the foregoing terms, this is not meant to be limiting, and those skilled in the art will appreciate that various steps and operations described below may also be implemented in hardware.
Referring to Fig. 1, the camera control method and device based on hand track provided by the embodiments of the present invention are mainly applied to terminals with a camera module, such as mobile phones and tablets (PAD). In specific use, when a user needs to control the terminal to take pictures in a contactless manner, an instruction is input to the terminal so that the terminal detects a trigger signal. When the terminal detects the trigger signal, the camera module of the terminal continuously collects images of a preset area. When the hand in the images is detected to be a preset gesture (for example, the fist gesture in Fig. 1), the movement trajectory of the hand (for example, the trajectory A in Fig. 1) is acquired, and a corresponding operation instruction (for example, an instruction to turn the flash on or off, or a photographing instruction) is then generated according to the trajectory. A corresponding operation, such as switching the flash or taking a picture, is performed on the camera module according to the operation instruction.
Referring to Fig. 2, Fig. 2 is a flow chart of the camera control method based on hand track in an embodiment of the present invention. In this embodiment, the camera control method based on hand track comprises the following steps.
S101: when a trigger signal is detected, continuously collect images of a preset area.
In this step, when the user needs to operate the camera module of the terminal to take pictures in a contactless manner, an instruction is input to the terminal. After receiving the instruction for a predetermined time, the terminal generates a trigger signal, so that the camera module of the terminal starts to collect images of the preset area. The preset area generally refers to the field of view in front of the lens of the camera module.
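As an illustration of this continuous-collection step, the following sketch shows a minimal capture loop in Python with OpenCV. It is not the patent's implementation: the trigger check (`trigger_detected`) and the per-frame consumer (`process_frame`) are placeholders introduced here, and camera index 0 is an assumption.

```python
import cv2

def collect_frames(trigger_detected, process_frame, camera_index=0):
    """Continuously collect frames of the preset area once a trigger signal is seen (S101).

    trigger_detected: callable returning True once the user's start instruction was received.
    process_frame:    callable invoked with each captured frame (steps S102 onwards);
                      returning "stop" ends the collection loop.
    """
    if not trigger_detected():
        return
    cap = cv2.VideoCapture(camera_index)  # camera covering the preset area in front of the lens
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if process_frame(frame) == "stop":
                break
    finally:
        cap.release()
```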
S102: when the hand in the images is detected to be a preset gesture, acquire the movement trajectory of the hand.
In this step, the terminal analyzes and processes the collected images one by one, so as to judge whether a hand is present and whether the gesture of the hand is the preset gesture.
In practical applications, referring to Fig. 2 and Fig. 3, step S102 includes the following sub-steps.
S1021: obtain a local image of the region where the hand is located in the image. In this step, the local image of the hand region is first extracted from the image according to the colour information or depth information of the hand, and operations such as denoising are then performed on the extracted image.
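One possible reading of this extraction is skin-colour segmentation followed by morphological denoising, sketched below with OpenCV. The HSV colour range and kernel size are illustrative assumptions only; the patent does not fix the thresholds, and depth information could be used instead.

```python
import cv2
import numpy as np

def extract_hand_region(frame):
    """Return (binary hand mask, cropped hand image, bounding-box offset), or (None, None, None)."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    lower, upper = np.array([0, 30, 60]), np.array([20, 150, 255])  # assumed skin-colour range
    mask = cv2.inRange(hsv, lower, upper)
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # denoising: remove speckles
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # denoising: fill small holes
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, None, None
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return mask[y:y + h, x:x + w], frame[y:y + h, x:x + w], (x, y)
```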
S1022: identify the current gesture of the hand according to the local image. In practical applications, characteristic information of a plurality of preset gestures is stored in the terminal in advance; of course, characteristic information of only one preset gesture may also be stored. In actual operation, the characteristic information of the hand in the local image is matched against the characteristic information of each preset gesture; when the characteristic information of the hand in the local image matches the characteristic information of one of the preset gestures, the current gesture of the hand is that preset gesture, and the identification of the current gesture is thereby completed.
For example, the plurality of preset gestures stored in the terminal may include a fist gesture, a scissors-hand gesture, an open-palm gesture, and so on. When the current gesture is detected to match one of the fist gesture, the scissors-hand gesture and the open-palm gesture, the current gesture is the matched preset gesture.
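The matching can be pictured as comparing a feature vector of the hand mask against stored reference vectors, one per preset gesture. The sketch below uses log-scaled Hu moments as the characteristic information and a nearest-neighbour match with a distance threshold; both choices, and the threshold value, are assumptions of this sketch rather than the patent's prescribed algorithm.

```python
import cv2
import numpy as np

def hand_features(mask):
    """Illustrative shape descriptor: log-scaled Hu moments of the binary hand mask."""
    hu = cv2.HuMoments(cv2.moments(mask)).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

def recognize_gesture(mask, preset_features, threshold=0.5):
    """Match the hand against the stored preset-gesture features (S1022).

    preset_features: dict mapping gesture name (e.g. "fist", "scissors", "open_palm")
                     to a reference feature vector computed offline from sample images.
    Returns the best-matching preset gesture name, or None if no preset gesture matches.
    """
    feats = hand_features(mask)
    best_name, best_dist = None, float("inf")
    for name, ref in preset_features.items():
        dist = float(np.linalg.norm(feats - ref))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None
```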
S1023: when the current gesture is the preset gesture, acquire the movement trajectory of the hand.
In step S1023, when the current gesture is the preset gesture, coordinate points of a preset position of the hand are obtained from the current frame image and the subsequent frames, and the movement trajectory of the hand is generated according to the plurality of coordinate points. For example, after the user's hand reaches the preset area monitored by the terminal with a fist gesture, once the terminal recognizes the fist gesture of the hand, the subsequent frames are analyzed starting from the frame in which the fist gesture was recognized, and the coordinate points of the preset position of the hand are obtained from them. The preset position may, for example, be the position of the geometric centre of the hand. After a plurality of discrete coordinate points have been acquired in succession, curve fitting is performed on the plurality of coordinate points to obtain the movement trajectory of the hand.
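A minimal sketch of this step, under the assumption that the preset position is the geometric centre of the hand mask and that a low-order polynomial fit is an acceptable curve-fitting choice, could look as follows.

```python
import numpy as np

def hand_center(mask, offset=(0, 0)):
    """Geometric centre of the binary hand mask, returned in full-image coordinates."""
    ys, xs = np.nonzero(mask)
    return (float(xs.mean()) + offset[0], float(ys.mean()) + offset[1])

def fit_trajectory(points, degree=3, samples=100):
    """Fit a smooth curve through the per-frame centre points (S1023).

    points: list of (x, y) tuples, one per frame after the preset gesture was recognized.
    Returns a (samples, 2) array of points along the fitted movement trajectory.
    """
    pts = np.asarray(points, dtype=float)
    deg = min(degree, len(pts) - 1)       # keep the fit well-posed for short tracks
    t = np.linspace(0.0, 1.0, len(pts))   # frame index as the curve parameter
    cx = np.polyfit(t, pts[:, 0], deg)
    cy = np.polyfit(t, pts[:, 1], deg)
    ts = np.linspace(0.0, 1.0, samples)
    return np.column_stack([np.polyval(cx, ts), np.polyval(cy, ts)])
```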
S103: generate a corresponding operation instruction according to the movement trajectory.
In some embodiments, referring to Fig. 2 and Fig. 4, the corresponding operation instruction can be obtained from the movement trajectory alone; in this case, step S103 includes the following sub-steps.
S1031: match the movement trajectory against the pre-stored trajectories in a preset database, and generate a matching result. One-to-one mapping relations between movement trajectories and operation instructions are stored in advance in the database of the terminal. For example, a plurality of trajectories are stored, and the number of operation instructions is the same as the number of trajectories: trajectory A corresponds to turning on the flash, trajectory B corresponds to turning off the flash, trajectory C corresponds to turning on the beauty function, trajectory D corresponds to turning off the beauty function, and trajectory E corresponds to taking a picture.
S1032: generate the corresponding operation instruction according to the matching result. For example, when the movement trajectory is trajectory A, an instruction to turn on the flash is generated; when the movement trajectory is trajectory B, an instruction to turn off the flash is generated; and when the movement trajectory is trajectory C, an instruction to turn on the beauty function is generated. The mapping is, of course, not limited to these examples.
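One way to sketch S1031/S1032 is a nearest-trajectory match against stored reference tracks followed by a table lookup. The distance measure, the threshold, the track labels A-E and the instruction names below are illustrative assumptions; the patent only requires that stored trajectories map one-to-one onto operation instructions.

```python
import numpy as np

# S1032: illustrative one-to-one mapping of stored trajectories to operation instructions,
# mirroring the A-E examples in the text (instruction names are this sketch's, not the patent's).
TRACK_TO_INSTRUCTION = {
    "A": "FLASH_ON", "B": "FLASH_OFF",
    "C": "BEAUTY_ON", "D": "BEAUTY_OFF", "E": "TAKE_PHOTO",
}

def match_trajectory(track, stored_tracks, max_dist=40.0):
    """S1031: compare the fitted trajectory against the pre-stored trajectories.

    track:         (N, 2) array of sampled points, e.g. from fit_trajectory().
    stored_tracks: dict mapping label -> (N, 2) reference array, assumed to be
                   resampled offline to the same number of points.
    Returns the best-matching label, or None if no stored trajectory is close enough.
    """
    best_label, best_dist = None, float("inf")
    for label, ref in stored_tracks.items():
        dist = float(np.mean(np.linalg.norm(track - ref, axis=1)))  # mean point-to-point distance
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist < max_dist else None

def generate_instruction(track, stored_tracks):
    """S1031 + S1032: match the trajectory and generate the corresponding instruction."""
    return TRACK_TO_INSTRUCTION.get(match_trajectory(track, stored_tracks))
```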
Referring to Fig. 2 and Fig. 5, in other embodiments, the operation instruction may be generated according to both the species of the preset gesture and the movement trajectory. In this case, step S103 includes the following sub-steps.
S1033: obtain first feature information of the operation according to the species of the preset gesture. In this step, the first feature information indicates the object of the operation, such as flash-related operations, focus-adjustment-related operations, or beauty-function-switch-related operations. For example, gesture F corresponds to flash operations, gesture G corresponds to focus-adjustment operations, and gesture H corresponds to beauty-function-switch operations.
S1034: obtain second feature information of the operation according to the movement trajectory. In this step, the second feature information indicates the type of the operation. For example, trajectory A corresponds to turning on (the flash, the beauty function, or photographing) or lengthening the focal length, and trajectory B corresponds to turning off (the flash, the beauty function, or photographing) or shortening the focal length. Suppose, for example, that the detected trajectory is trajectory A.
S1035: generate the corresponding operation instruction according to the first feature information and the second feature information. In this step, the first feature information and the second feature information are combined to obtain the corresponding operation instruction. For example, when the first feature information indicates a flash operation and the second feature information corresponds to a turn-on operation, the operation instruction generated from the first feature information and the second feature information is a flash-on instruction.
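These three sub-steps amount to composing an instruction from two lookups: the gesture species selects the operation object, and the trajectory selects the operation type. The gesture/track labels and the instruction encoding below are illustrative placeholders, not values defined by the patent.

```python
# First feature information: the preset gesture species selects the operation object
# (labels F/G/H mirror the examples in the text; the mapping itself is assumed).
GESTURE_TO_OBJECT = {"F": "flash", "G": "focus", "H": "beauty"}

# Second feature information: the movement trajectory selects the operation type.
TRACK_TO_ACTION = {"A": "turn_on_or_lengthen", "B": "turn_off_or_shorten"}

def generate_instruction_from_features(gesture_label, track_label):
    """S1033-S1035: combine the first and second feature information into one instruction."""
    obj = GESTURE_TO_OBJECT.get(gesture_label)
    action = TRACK_TO_ACTION.get(track_label)
    if obj is None or action is None:
        return None
    # e.g. gesture F (flash) combined with trajectory A (turn on) -> "flash:turn_on_or_lengthen",
    # which corresponds to the flash-on instruction in the example above.
    return f"{obj}:{action}"
```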
S104: perform the corresponding operation on the camera module according to the operation instruction.
In this step, continuing the example in step S1035 above, after the flash-on instruction has been generated, the camera module of the terminal is controlled to turn on the flash mode. In this step, a corresponding voice prompt may also be produced when the corresponding operation is performed, so as to ensure the accuracy of the operation.
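This dispatch step can be sketched as a small table from instruction to camera-module call plus an optional spoken confirmation. The `camera` object's methods and the `speak` hook are assumed placeholders; the patent does not name a concrete camera or text-to-speech API.

```python
def execute_instruction(instruction, camera, speak=print):
    """S104: perform the corresponding camera operation and give a voice prompt.

    camera: object assumed to expose set_flash(bool), set_beauty(bool) and capture()
            (placeholder interface, not a real library API).
    speak:  text-to-speech hook; print is used here as a stand-in.
    """
    actions = {
        "FLASH_ON":   (lambda: camera.set_flash(True),   "Flash turned on"),
        "FLASH_OFF":  (lambda: camera.set_flash(False),  "Flash turned off"),
        "BEAUTY_ON":  (lambda: camera.set_beauty(True),  "Beauty mode on"),
        "BEAUTY_OFF": (lambda: camera.set_beauty(False), "Beauty mode off"),
        "TAKE_PHOTO": (lambda: camera.capture(),         "Photo taken"),
    }
    if instruction in actions:
        operate, prompt = actions[instruction]
        operate()
        speak(prompt)  # voice prompt confirming the operation, as described above
```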
It can be seen from the above that, in the camera control method based on hand track provided by the embodiments of the present invention, when a trigger signal is detected, images of a preset area are continuously collected; when the hand in the images is detected to be a preset gesture, the movement trajectory of the hand is acquired; a corresponding operation instruction is generated according to the trajectory; and a corresponding operation is performed on the camera module according to the operation instruction. Contactless control of photographing is thereby achieved, and because the preset gesture is combined with the movement trajectory of the gesture, the possibility of misoperation is avoided.
Referring to Fig. 6, Fig. 6 is a structural diagram of the camera control device based on hand track in an embodiment of the present invention. The camera control device 200 based on hand track includes a collection module 201, an acquisition module 202, a generation module 203 and an operation module 204.
The collection module 201 is configured to continuously collect images of a preset area when a trigger signal is detected. When the user needs to operate the camera module of the terminal to take pictures in a contactless manner, an instruction is input to the terminal. After receiving the instruction for a predetermined time, the terminal generates a trigger signal, so that the camera module of the terminal starts to collect images of the preset area. The preset area generally refers to the central region of the field of view in front of the lens of the camera module.
The acquisition module 202 is configured to acquire the movement trajectory of the hand when the hand in the images is detected to be a preset gesture. The acquisition module 202 includes a third acquiring unit 2021, a recognition unit 2022 and a fourth acquiring unit 2023.
The third acquiring unit 2021 is configured to obtain a local image of the region where the hand is located in the image. The third acquiring unit 2021 first extracts the local image of the hand region from the image according to the colour information or depth information of the hand, and then performs operations such as denoising on the extracted image.
The recognition unit 2022 is configured to identify the current gesture of the hand according to the local image. In practical applications, characteristic information of a plurality of preset gestures is stored in the terminal in advance; of course, characteristic information of only one preset gesture may also be stored. The recognition unit 2022 matches the characteristic information of the hand in the local image against the characteristic information of each preset gesture; when the characteristic information of the hand in the local image matches the characteristic information of one of the preset gestures, the current gesture of the hand is that preset gesture, and the identification of the current gesture is thereby completed.
The fourth acquiring unit 2023 is configured to acquire the movement trajectory of the hand when the current gesture is the preset gesture. In practical applications, when the current gesture is the preset gesture, the fourth acquiring unit 2023 obtains coordinate points of a preset position of the hand from the current frame image and the subsequent frames, and generates the movement trajectory of the hand according to the plurality of coordinate points. For example, after the user's hand reaches the preset area monitored by the terminal with a fist gesture, once the terminal recognizes the fist gesture of the hand, the subsequent frames are analyzed starting from the frame in which the fist gesture was recognized, and the coordinate points of the preset position of the hand are obtained from them. The preset position may, for example, be the position of the geometric centre of the hand. After a plurality of discrete coordinate points have been acquired in succession, curve fitting is performed on the plurality of coordinate points to obtain the movement trajectory of the hand.
The generation module 203 is configured to generate a corresponding operation instruction according to the movement trajectory.
In some embodiments, as shown in Fig. 7, the generation module 203 includes a first acquiring unit 2031, a second acquiring unit 2032 and a first generation unit 2033.
The first acquiring unit 2031 is configured to obtain first feature information of the operation according to the species of the preset gesture. The first feature information indicates the object of the operation, such as flash-related operations, focus-adjustment-related operations, or beauty-function-switch-related operations. For example, gesture F corresponds to flash operations, gesture G corresponds to focus-adjustment operations, and gesture H corresponds to beauty-function-switch operations.
The second acquiring unit 2032 is configured to obtain second feature information of the operation according to the movement trajectory. For example, trajectory A corresponds to turning on (the flash, the beauty function, or photographing) or lengthening the focal length, and trajectory B corresponds to turning off (the flash, the beauty function, or photographing) or shortening the focal length. Suppose, for example, that the detected trajectory is trajectory A.
The first generation unit 2033 is configured to generate the corresponding operation instruction according to the first feature information and the second feature information. The first generation unit 2033 combines the first feature information and the second feature information to obtain the corresponding operation instruction. For example, when the first feature information indicates a flash operation and the second feature information corresponds to a turn-on operation, the operation instruction generated from the first feature information and the second feature information is a flash-on instruction.
In other embodiments, as shown in Fig. 8, the generation module 203 includes a matching unit 2034 and a second generation unit 2035.
The matching unit 2034 is configured to match the movement trajectory against the pre-stored trajectories in a preset database and generate a matching result. One-to-one mapping relations between movement trajectories and operation instructions are stored in advance in the database of the terminal. For example, a plurality of trajectories are stored, and the number of operation instructions is the same as the number of trajectories: trajectory A corresponds to turning on the flash, trajectory B corresponds to turning off the flash, trajectory C corresponds to turning on the beauty function, trajectory D corresponds to turning off the beauty function, and trajectory E corresponds to taking a picture.
The second generation unit 2035 is configured to generate the corresponding operation instruction according to the matching result. For example, when the movement trajectory is trajectory A, an instruction to turn on the flash is generated; when the movement trajectory is trajectory B, an instruction to turn off the flash is generated; and when the movement trajectory is trajectory C, an instruction to turn on the beauty function is generated. The mapping is, of course, not limited to these examples.
The operation module 204 is configured to perform the corresponding operation on the camera module according to the operation instruction, for example turning the flash mode on or off, turning the beauty mode on or off, adjusting the focal length, and so on. The operation module 204 may also produce a corresponding voice prompt while performing the corresponding operation on the camera module according to the operation instruction.
It can be seen from the above that, in the camera control device based on hand track provided by the embodiments of the present invention, the collection module continuously collects images of a preset area when a trigger signal is detected; the acquisition module acquires the movement trajectory of the hand when the hand in the images is detected to be a preset gesture; the generation module generates a corresponding operation instruction according to the trajectory; and the operation module performs a corresponding operation on the camera module according to the operation instruction. Remote, contactless control of the terminal's photographing function is thereby achieved, and because the preset gesture is combined with the movement trajectory of the hand, the movement trajectory of the hand is only detected after the preset gesture has been detected, which avoids the possibility of misoperation.
Referring to Fig. 9, an embodiment of the present invention further provides a terminal 300. The terminal 300 may include a radio frequency (RF) circuit 301, a memory 302 including one or more computer-readable storage media, an input unit 303, a display unit 304, a sensor 305, an audio circuit 306, a Wireless Fidelity (WiFi) module 307, a processor 308 including one or more processing cores, a power supply 309, and other components. Those skilled in the art will appreciate that the terminal structure shown in Fig. 9 does not limit the terminal, which may include more or fewer components than illustrated, combine certain components, or use a different arrangement of components.
The RF circuit 301 may be used to receive and send signals during information transmission and reception or during a call; in particular, after receiving downlink information from a base station, it passes the information to one or more processors 308 for processing, and it also sends uplink data to the base station. Generally, the RF circuit 301 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 301 may also communicate with networks and other devices by wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System of Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
The memory 302 may be used to store software programs and modules, and the processor 308 executes various functional applications and data processing by running the software programs and modules stored in the memory 302. The memory 302 may mainly include a program storage area and a data storage area, where the program storage area may store the operating system, applications required for at least one function (such as a sound playback function or an image playback function), and so on, and the data storage area may store data created according to the use of the terminal (such as audio data and a phone book). In addition, the memory 302 may include a high-speed random access memory and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device. Accordingly, the memory 302 may also include a memory controller to provide the processor 308 and the input unit 303 with access to the memory 302.
The input unit 303 may be used to receive input numbers or character information and to generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. Specifically, in one embodiment, the input unit 303 may include a touch-sensitive surface and other input devices. The touch-sensitive surface, also referred to as a touch display screen or touch pad, collects the user's touch operations on or near it (such as operations performed by the user on or near the touch-sensitive surface with a finger, a stylus, or any other suitable object or accessory) and drives the corresponding connection device according to a preset formula. Optionally, the touch-sensitive surface may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch orientation, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 308, and can receive and execute commands sent by the processor 308. In addition, the touch-sensitive surface may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch-sensitive surface, the input unit 303 may also include other input devices, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 304 may be used to display information input by the user or information provided to the user, as well as the various graphical user interfaces of the terminal, which may be composed of graphics, text, icons, video, and any combination thereof. The display unit 304 may include a display panel, which may optionally be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. Further, the touch-sensitive surface may cover the display panel; when the touch-sensitive surface detects a touch operation on or near it, it transmits the operation to the processor 308 to determine the type of the touch event, and the processor 308 then provides a corresponding visual output on the display panel according to the type of the touch event. Although in Fig. 9 the touch-sensitive surface and the display panel implement input and output functions as two separate components, in some embodiments the touch-sensitive surface and the display panel may be integrated to implement the input and output functions.
The terminal may also include at least one sensor 305, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, where the ambient light sensor may adjust the brightness of the display panel according to the brightness of the ambient light, and the proximity sensor may turn off the display panel and/or the backlight when the terminal is moved to the ear. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in all directions (generally on three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications that recognize the posture of the mobile phone (such as horizontal/vertical screen switching, related games, and magnetometer posture calibration), vibration-recognition-related functions (such as a pedometer and tapping), and so on. Other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor may also be configured in the terminal, and will not be described here.
The audio circuit 306, a loudspeaker and a microphone may provide an audio interface between the user and the terminal. The audio circuit 306 may convert received audio data into an electrical signal and transmit it to the loudspeaker, which converts it into a sound signal for output; on the other hand, the microphone converts collected sound signals into electrical signals, which are received by the audio circuit 306 and converted into audio data; after the audio data are processed by the processor 308, they are sent via the RF circuit 301 to, for example, another terminal, or output to the memory 302 for further processing. The audio circuit 306 may also include an earphone jack to provide communication between a peripheral earphone and the terminal.
WiFi is a short-range wireless transmission technology. Through the WiFi module 307, the terminal can help the user send and receive e-mail, browse web pages, access streaming media, and so on, providing the user with wireless broadband Internet access. Although Fig. 9 shows the WiFi module 307, it can be understood that it is not an essential part of the terminal and may be omitted as needed without changing the essence of the invention.
The processor 308 is the control centre of the terminal. It connects all parts of the entire mobile phone using various interfaces and lines, and performs the various functions of the terminal and processes data by running or executing the software programs and/or modules stored in the memory 302 and calling the data stored in the memory 302, thereby monitoring the mobile phone as a whole. Optionally, the processor 308 may include one or more processing cores; preferably, the processor 308 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, applications, and so on, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 308.
The terminal also includes a power supply 309 (such as a battery) for supplying power to the various components. Preferably, the power supply may be logically connected to the processor 308 through a power management system, so that functions such as managing charging, discharging, and power consumption are implemented through the power management system. The power supply 309 may also include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and any other components.
Although not shown, the terminal may also include a camera, a Bluetooth module, and so on, which will not be described here. Specifically, in this embodiment, the processor 308 in the terminal loads the executable file corresponding to the process of one or more applications into the memory 302 according to the following instructions, and the processor 308 runs the applications stored in the memory 302, thereby implementing the following functions:
when a trigger signal is detected, continuously collecting images of a preset area;
when the hand in the images is detected to be a preset gesture, acquiring the movement trajectory of the hand;
generating a corresponding operation instruction according to the movement trajectory;
performing a corresponding operation on the camera module according to the operation instruction.
In the terminal provided by the embodiments of the present invention, when a trigger signal is detected, images of a preset area are continuously collected; when the hand in the images is detected to be a preset gesture, the movement trajectory of the hand is acquired; a corresponding operation instruction is generated according to the movement trajectory; and a corresponding operation is performed on the camera module according to the operation instruction. Contactless control of photographing is thereby achieved, and because the preset gesture is combined with the movement trajectory of the gesture, trajectory recognition is performed only after the preset gesture has been detected, which reduces the possibility of misoperation.
Various operations of embodiments are provided herein. In one embodiment, the one or more operations described may constitute computer-readable instructions stored on one or more computer-readable media, which, when executed by an electronic device, cause the computing device to perform the operations. The order in which some or all of the operations are described should not be construed as implying that these operations are necessarily order-dependent. Those skilled in the art will appreciate alternative orderings having the benefit of this description. Furthermore, it should be understood that not all operations are necessarily present in every embodiment provided herein.
Moreover, the word "preferably" is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as "preferable" is not necessarily to be construed as more advantageous than other aspects or designs; rather, the use of the word "preferably" is intended to present concepts in a concrete fashion. The term "or" as used in this application is intended to mean an inclusive "or" rather than an exclusive "or". That is, unless specified otherwise or clear from context, "X employs A or B" is intended to mean any of the natural inclusive permutations: if X employs A, X employs B, or X employs both A and B, then "X employs A or B" is satisfied in any of the foregoing instances.
Moreover, although the disclosure has been shown and described with respect to one or more implementations, equivalent alterations and modifications will occur to those skilled in the art based upon a reading and understanding of this specification and the annexed drawings. The disclosure includes all such modifications and alterations and is limited only by the scope of the following claims. In particular regard to the various functions performed by the above-described components (e.g., elements, resources, etc.), the terms used to describe such components are intended to correspond to any component which performs the specified function of the described component (e.g., one that is functionally equivalent), even though not structurally equivalent to the disclosed structure which performs the function in the herein-illustrated exemplary implementations of the disclosure. In addition, while a particular feature of the disclosure may have been disclosed with respect to only one of several implementations, such a feature may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application. Furthermore, to the extent that the terms "comprising", "having", "containing", or variants thereof are used in the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term "including".
Each functional unit in the embodiments of the present invention may be integrated in one processing module, each unit may exist alone physically, or two or more units may be integrated in one module. The above integrated module may be implemented either in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium. The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like. Each of the above devices or systems may perform the method in the corresponding method embodiment.
In summary, although the present invention has been disclosed above with preferred embodiments, the preferred embodiments are not intended to limit the present invention. Those of ordinary skill in the art can make various changes and modifications without departing from the spirit and scope of the present invention, and therefore the protection scope of the present invention is defined by the scope of the claims.

Claims (12)

1. A camera control method based on hand track, characterized by comprising:
when a trigger signal is detected, continuously collecting images of a preset area;
when the hand in the images is detected to be a preset gesture, acquiring a movement trajectory of the hand;
generating a corresponding operation instruction according to the movement trajectory;
performing a corresponding operation on a camera module according to the operation instruction.
2. The camera control method based on hand track according to claim 1, characterized in that the step of generating a corresponding operation instruction according to the movement trajectory comprises:
obtaining first feature information of the operation according to the species of the preset gesture;
obtaining second feature information of the operation according to the movement trajectory;
generating the corresponding operation instruction according to the first feature information and the second feature information.
3. The camera control method based on hand track according to claim 1, characterized in that the step of generating a corresponding operation instruction according to the movement trajectory comprises:
matching the movement trajectory against pre-stored trajectories in a preset database, and generating a matching result;
generating the corresponding operation instruction according to the matching result.
4. The camera control method based on hand track according to any one of claims 1-3, characterized in that the step of acquiring the movement trajectory of the hand when the hand in the images is detected to be a preset gesture comprises:
obtaining a local image of the region where the hand is located in the image;
identifying the current gesture of the hand according to the local image;
when the current gesture is the preset gesture, acquiring the movement trajectory of the hand.
5. The camera control method based on hand track according to claim 4, characterized in that the step of acquiring the movement trajectory of the hand when the current gesture is the preset gesture comprises:
when the current gesture is the preset gesture, obtaining coordinate points of a preset position of the hand from the current frame image and subsequent frames;
generating the movement trajectory of the hand according to the plurality of coordinate points.
6. The camera control method based on hand track according to claim 1, characterized in that the step of performing a corresponding operation on a camera module according to the operation instruction comprises:
performing the corresponding operation on the camera module according to the operation instruction and producing a corresponding voice prompt.
7. A camera control device based on hand track, characterized by comprising:
a collection module, configured to continuously collect images of a preset area when a trigger signal is detected;
an acquisition module, configured to acquire a movement trajectory of the hand when the hand in the images is detected to be a preset gesture;
a generation module, configured to generate a corresponding operation instruction according to the movement trajectory;
an operation module, configured to perform a corresponding operation on a camera module according to the operation instruction.
8. The camera control device based on hand track according to claim 7, characterized in that the generation module comprises:
a first acquiring unit, configured to obtain first feature information of the operation according to the species of the preset gesture;
a second acquiring unit, configured to obtain second feature information of the operation according to the movement trajectory;
a first generation unit, configured to generate the corresponding operation instruction according to the first feature information and the second feature information.
9. The camera control device based on hand track according to claim 7, characterized in that the generation module comprises:
a matching unit, configured to match the movement trajectory against pre-stored trajectories in a preset database and generate a matching result;
a second generation unit, configured to generate the corresponding operation instruction according to the matching result.
10. The camera control device based on hand track according to any one of claims 7-9, characterized in that the acquisition module comprises:
a third acquiring unit, configured to obtain a local image of the region where the hand is located in the image;
a recognition unit, configured to identify the current gesture of the hand according to the local image;
a fourth acquiring unit, configured to acquire the movement trajectory of the hand when the current gesture is the preset gesture.
11. The camera control device based on hand track according to claim 10, characterized in that the fourth acquiring unit is configured to obtain coordinate points of a preset position of the hand from the current frame image and subsequent frames when the current gesture is the preset gesture, and to generate the movement trajectory of the hand according to the plurality of coordinate points.
12. A terminal, characterized by comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the method according to any one of claims 1-6 when executing the computer program.
CN201710253799.9A 2017-04-18 2017-04-18 Camera control method, device and terminal based on hand track Pending CN107105093A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710253799.9A CN107105093A (en) 2017-04-18 2017-04-18 Camera control method, device and terminal based on hand track

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710253799.9A CN107105093A (en) 2017-04-18 2017-04-18 Camera control method, device and terminal based on hand track

Publications (1)

Publication Number Publication Date
CN107105093A true CN107105093A (en) 2017-08-29

Family

ID=59657067

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710253799.9A Pending CN107105093A (en) 2017-04-18 2017-04-18 Camera control method, device and terminal based on hand track

Country Status (1)

Country Link
CN (1) CN107105093A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107660039A (en) * 2017-09-26 2018-02-02 哈尔滨拓博科技有限公司 A kind of lamp control system for identifying dynamic gesture
CN110187771A (en) * 2019-05-31 2019-08-30 努比亚技术有限公司 Gesture interaction method, device, wearable device and computer storage medium high up in the air
WO2019218521A1 (en) * 2018-05-14 2019-11-21 Boe Technology Group Co., Ltd. Gesture recognition apparatus, control method thereof, and display apparatus
CN112418080A (en) * 2020-11-20 2021-02-26 江苏奥格视特信息科技有限公司 Finger action recognition method of laser scanning imager
CN112905008A (en) * 2021-01-29 2021-06-04 海信视像科技股份有限公司 Gesture adjustment image display method and display device
CN113325948A (en) * 2020-02-28 2021-08-31 华为技术有限公司 Air-isolated gesture adjusting method and terminal
CN114415927A (en) * 2022-01-05 2022-04-29 广东统信软件有限公司 Photographing method, photographing device, computing equipment and storage medium
CN114460881A (en) * 2022-02-11 2022-05-10 广东好太太智能家居有限公司 Clothes airing equipment control device and method and clothes airing equipment
CN115022549A (en) * 2022-06-27 2022-09-06 影石创新科技股份有限公司 Shooting composition method, shooting composition device, computer equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102857642A (en) * 2012-09-28 2013-01-02 北京蒙恬科技有限公司 Device for automatically shooting document and method thereof
CN103384301A (en) * 2013-07-12 2013-11-06 广东欧珀移动通信有限公司 Method and device for performing autodyne through rear camera and mobile terminal
CN105120144A (en) * 2015-07-31 2015-12-02 小米科技有限责任公司 Image shooting method and device
KR20160034065A (en) * 2014-09-19 2016-03-29 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN106250021A (en) * 2016-07-29 2016-12-21 维沃移动通信有限公司 A kind of control method taken pictures and mobile terminal

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102857642A (en) * 2012-09-28 2013-01-02 北京蒙恬科技有限公司 Device for automatically shooting document and method thereof
CN103384301A (en) * 2013-07-12 2013-11-06 广东欧珀移动通信有限公司 Method and device for performing autodyne through rear camera and mobile terminal
KR20160034065A (en) * 2014-09-19 2016-03-29 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN105120144A (en) * 2015-07-31 2015-12-02 小米科技有限责任公司 Image shooting method and device
CN106250021A (en) * 2016-07-29 2016-12-21 维沃移动通信有限公司 A kind of control method taken pictures and mobile terminal

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107660039A (en) * 2017-09-26 2018-02-02 哈尔滨拓博科技有限公司 A kind of lamp control system for identifying dynamic gesture
WO2019218521A1 (en) * 2018-05-14 2019-11-21 Boe Technology Group Co., Ltd. Gesture recognition apparatus, control method thereof, and display apparatus
US11314334B2 (en) 2018-05-14 2022-04-26 Boe Technology Group Co., Ltd. Gesture recognition apparatus, control method thereof, and display apparatus
CN110187771A (en) * 2019-05-31 2019-08-30 努比亚技术有限公司 Gesture interaction method, device, wearable device and computer storage medium high up in the air
CN110187771B (en) * 2019-05-31 2024-04-26 努比亚技术有限公司 Method and device for interaction of air gestures, wearable equipment and computer storage medium
CN113325948B (en) * 2020-02-28 2023-02-07 华为技术有限公司 Air-isolated gesture adjusting method and terminal
CN113325948A (en) * 2020-02-28 2021-08-31 华为技术有限公司 Air-isolated gesture adjusting method and terminal
CN112418080A (en) * 2020-11-20 2021-02-26 江苏奥格视特信息科技有限公司 Finger action recognition method of laser scanning imager
CN112905008A (en) * 2021-01-29 2021-06-04 海信视像科技股份有限公司 Gesture adjustment image display method and display device
CN114415927A (en) * 2022-01-05 2022-04-29 广东统信软件有限公司 Photographing method, photographing device, computing equipment and storage medium
CN114415927B (en) * 2022-01-05 2024-03-26 广东统信软件有限公司 Photographing method, photographing device, computing equipment and storage medium
CN114460881A (en) * 2022-02-11 2022-05-10 广东好太太智能家居有限公司 Clothes airing equipment control device and method and clothes airing equipment
CN115022549A (en) * 2022-06-27 2022-09-06 影石创新科技股份有限公司 Shooting composition method, shooting composition device, computer equipment and storage medium
CN115022549B (en) * 2022-06-27 2024-04-16 影石创新科技股份有限公司 Shooting composition method, shooting composition device, computer equipment and storage medium

Similar Documents

Publication Publication Date Title
CN107105093A (en) Camera control method, device and terminal based on hand track
CN107944380A (en) Personal identification method, device and storage device
CN109426783A (en) Gesture identification method and system based on augmented reality
US11274932B2 (en) Navigation method, navigation device, and storage medium
CN109032734A (en) A kind of background application display methods and mobile terminal
CN107562835A (en) File search method, device, mobile terminal and computer-readable recording medium
CN104951432A (en) Information processing method and device
CN106527949B (en) A kind of unlocked by fingerprint method, apparatus and terminal
CN109508399A (en) A kind of facial expression image processing method, mobile terminal
CN107071129B (en) A kind of bright screen control method and mobile terminal
CN104699501B (en) A kind of method and device for running application program
CN108984064A (en) Multi-screen display method, device, storage medium and electronic equipment
CN109871358A (en) A kind of management method and terminal device
CN107040610A (en) Method of data synchronization, device, storage medium, terminal and server
CN104966086A (en) Living body identification method and apparatus
CN104820546B (en) Function information methods of exhibiting and device
CN109067981A (en) Split screen application switching method, device, storage medium and electronic equipment
CN108958629A (en) Split screen exits method, apparatus, storage medium and electronic equipment
CN107291326A (en) Icon processing method and terminal
CN106534528A (en) Processing method and device of text information and mobile terminal
CN106657254B (en) A kind of synchronous method of contact information, apparatus and system
CN107317917B (en) Application control method and Related product
CN109062643A (en) A kind of display interface method of adjustment, device and terminal
CN109062469A (en) Multi-screen display method, device, storage medium and electronic equipment
CN108459813A (en) A kind of searching method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170829