WO2015081485A1 - Method and device for a terminal device to recognize user gestures - Google Patents

Method and device for a terminal device to recognize user gestures

Info

Publication number
WO2015081485A1
Authority
WO
WIPO (PCT)
Prior art keywords
program
sensor
terminal device
identification
sensing data
Application number
PCT/CN2013/088389
Other languages
English (en)
Chinese (zh)
Inventor
吴昊
楚庆
钟山
Original Assignee
华为技术有限公司
Application filed by 华为技术有限公司
Priority to CN201380005552.5A (CN104169858B)
Priority to PCT/CN2013/088389 (WO2015081485A1)
Publication of WO2015081485A1


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • The present invention relates to the field of communications technologies, and in particular, to a method and device for a terminal device to recognize a user gesture.
  • BACKGROUND: At present, interaction through a touch screen has become the mainstream mode of terminal interaction.
  • Mainstream smartphones and tablet computers on the market are equipped with touch screens.
  • After years of use, users have gradually learned and become accustomed to human-computer interaction with terminal devices through gestures, and can easily complete various tasks on a mobile phone through single-touch and multi-touch operations combined with intuitive, vivid gestures.
  • Gesture recognition is performed by the touch screen, which detects the position and displacement of the touch points.
  • The main touch-screen gestures include: tapping a specific position on the screen with a finger and lifting it (Tap); pressing the screen with a finger for a long time (Long press); touching the screen with a finger, sliding it in a specific direction, and lifting it (Swipe); pressing two fingers on the screen and moving them outward or inward (Pinch); and pressing two fingers on the screen, rotating them, and lifting them (Rotate).
  • However, touch-screen recognition cannot overcome the limitation that a gesture must touch the screen: when the screen or the user's hand is dirty, the accuracy and usability of gesture recognition are reduced.
  • These limitations of touch-screen gesture recognition necessarily constrain gesture design, affecting the richness, usability, and ease of use of gestures. As the operations users need to perform on terminals grow in variety, touch-screen recognition leaves many more intuitive gestures unrecognizable.
  • In the prior art, a CMOS infrared sensor allows the external environment to be perceived regardless of the surrounding lighting conditions.
  • The sensor senses the environment as a black-and-white depth image: pure black represents infinite distance, and pure white represents zero distance.
  • The gray levels between black and white correspond to the physical distance from an object to the sensor. The sensor collects every point in its field of view to form a depth image representing the surrounding environment.
  • The sensor generates a depth-image stream at a rate of 30 frames per second, reproducing the surrounding environment in 3D in real time.
  • On this basis, human gestures can be identified, and sets of operational gestures that do not conflict with natural human gestures can be created.
  • However, this gesture-recognition technique is designed for a living-room scenario, where the distance between the operator and the sensor is usually 10 inches or more, which does not match the operating scenario of a handheld terminal.
  • In addition, the sensor used in this kind of gesture recognition is more complex and consumes more power than the sensors in existing handheld terminals, and is therefore not suitable for use in a handheld terminal.
  • Embodiments of the present invention provide a method and apparatus for a terminal device to recognize a user gesture, so as to improve the accuracy of user gesture recognition by the terminal.
  • A first aspect provides a method for a terminal device to recognize a user gesture, including:
  • the sensor of the terminal device acquires sensing data;
  • the terminal device matches the sensing data with a pre-registered recognition mode;
  • if the sensing data matches one or more of the recognition modes, the terminal device acquires a response operation of the matched recognition mode;
  • the terminal device performs the response operation.
  • The recognition mode includes a recognition rule and the response operation.
  • The recognition rule includes a single-sensor rule and/or a multi-sensor rule; the single-sensor rule is a rule established for the sensing data acquired by one sensor, and the multi-sensor rule is a comprehensive rule established for the sensing data acquired by multiple sensors.
  • Matching the sensing data with the pre-registered recognition mode specifically includes: the terminal device matches the sensing data with the single-sensor rule and/or the multi-sensor rule of the recognition mode;
  • that the sensing data matches one or more of the recognition modes specifically means:
  • the sensing data conforms to the single-sensor rule and/or the multi-sensor rule of the recognition mode.
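  • To make the structure described above concrete, the following is a minimal, illustrative Java sketch (not part of the patent itself; all class and method names are hypothetical) of a recognition mode that bundles single-sensor and multi-sensor rules with a response operation, and of the matching step that runs the response operation of every matched mode.

```java
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

// Hypothetical sketch of the described data model: a recognition mode bundles
// recognition rules (single-sensor and/or multi-sensor) with a response operation.
public class RecognitionModeSketch {

    /** Sensing data reported by one sensor, keyed by sensor name. */
    record SensorReading(String sensorName, double[] values) {}

    /** A single-sensor rule: a predicate over the data of exactly one sensor. */
    record SingleSensorRule(String sensorName, Predicate<double[]> condition) {
        boolean matches(Map<String, SensorReading> data) {
            SensorReading r = data.get(sensorName);
            return r != null && condition.test(r.values());
        }
    }

    /** A multi-sensor rule: a comprehensive predicate over several sensors' data. */
    record MultiSensorRule(List<String> sensorNames,
                           Predicate<Map<String, SensorReading>> condition) {
        boolean matches(Map<String, SensorReading> data) {
            return data.keySet().containsAll(sensorNames) && condition.test(data);
        }
    }

    /** A recognition mode: rules plus the response operation to run on a match. */
    record RecognitionMode(String id,
                           List<SingleSensorRule> singleSensorRules,
                           List<MultiSensorRule> multiSensorRules,
                           Runnable responseOperation) {
        boolean matches(Map<String, SensorReading> data) {
            return singleSensorRules.stream().allMatch(r -> r.matches(data))
                && multiSensorRules.stream().allMatch(r -> r.matches(data));
        }
    }

    /** Matching step: run the response operation of every matched mode. */
    static void match(List<RecognitionMode> registered, Map<String, SensorReading> data) {
        for (RecognitionMode mode : registered) {
            if (mode.matches(data)) {
                mode.responseOperation().run();   // e.g. zoom the camera
            }
        }
    }

    public static void main(String[] args) {
        SingleSensorRule lightDimmed = new SingleSensorRule("light", v -> v[0] < 10.0);
        MultiSensorRule handApproaching = new MultiSensorRule(List.of("light", "distance"),
                d -> d.get("distance").values()[0] < 5.0 && d.get("light").values()[0] < 10.0);
        RecognitionMode wave = new RecognitionMode("MR001", List.of(lightDimmed),
                List.of(handApproaching), () -> System.out.println("response: zoom the camera"));
        match(List.of(wave), Map.of(
                "light", new SensorReading("light", new double[]{3.0}),
                "distance", new SensorReading("distance", new double[]{2.0})));
    }
}
```

  • In this sketch a mode matches only when all of its registered rules are satisfied, mirroring the statement that the sensing data conforms to the single-sensor rule and/or the multi-sensor rule of the mode.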
  • Before the sensor of the terminal device acquires the sensing data, the method further includes:
  • the terminal device turns on the sensor;
  • the method for the terminal device to turn on the sensor includes:
  • the terminal device opens a program;
  • the terminal device registers, according to the program, a recognition mode corresponding to the program;
  • the terminal device turns on the sensor according to the recognition mode.
  • The response operation of the recognition mode corresponding to the program is performed by the program that is opened by the terminal device;
  • the response operation is an operation that the opened program can perform.
  • The program includes a system-level program and/or an application-level program;
  • the system-level program is a program that is started when the terminal device is powered on, and the application-level program is a program that is opened by the terminal device according to a user input;
  • the method further includes:
  • the terminal device closes the application-level program;
  • the terminal device deregisters the recognition mode corresponding to the closed application-level program, and turns off the sensor corresponding to the deregistered recognition mode.
  • Registering the recognition mode corresponding to the program specifically includes: the terminal device registers, according to the program, the recognition mode corresponding to the program in an application framework.
  • A terminal is further provided, including:
  • a sensing data acquiring module, configured to acquire sensing data by using a sensor;
  • a matching module, configured to match the sensing data with a pre-registered recognition mode;
  • a response operation acquiring module, configured to: if the sensing data matches one or more of the recognition modes, acquire a response operation of the matched recognition mode;
  • a response operation execution module, configured to perform the response operation.
  • The recognition mode includes a recognition rule and the response operation;
  • the recognition rule includes a single-sensor rule and/or a multi-sensor rule; the single-sensor rule is a rule established for the sensing data acquired by one sensor;
  • the multi-sensor rule is a comprehensive rule established for the sensing data acquired by multiple sensors.
  • The matching module specifically includes:
  • a first matching module, configured to match the sensing data with the single-sensor rule of the recognition mode;
  • a second matching module, configured to match the sensing data with the multi-sensor rule of the recognition mode.
  • The terminal further includes:
  • a sensor opening module, configured to turn on the sensor;
  • the sensor opening module specifically includes:
  • a program opening unit, configured to open a program;
  • a recognition mode registration unit, configured to register, according to the program, a recognition mode corresponding to the program; and a sensor-on unit, configured to turn on the sensor according to the recognition mode.
  • The program includes a system-level program and/or an application-level program;
  • the system-level program is a program that is started when the terminal device is powered on;
  • the application-level program is a program that is opened by the terminal device according to a user input;
  • the terminal further includes:
  • a program closing module, configured to close the application-level program;
  • a recognition mode deregistration module, configured to deregister the recognition mode corresponding to the closed application-level program; and
  • a sensor closing module, configured to turn off the sensor corresponding to the deregistered recognition mode.
  • A terminal is further provided, including a processor, a memory, a bus, and a sensor, where the processor, the memory, and the sensor are connected to each other through the bus; the sensor is configured to acquire sensing data; the memory is configured to store computer-executable instructions; and when the terminal runs, the processor executes the computer-executable instructions stored in the memory, so that the terminal matches the sensing data with a pre-registered recognition mode; if the sensing data matches one or more of the recognition modes, acquires a response operation of the matched recognition mode; and performs the response operation.
  • The recognition mode includes a recognition rule and the response operation;
  • the recognition rule includes a single-sensor rule and/or a multi-sensor rule; the single-sensor rule is a rule established for the sensing data acquired by one sensor, and the multi-sensor rule is a comprehensive rule established for the sensing data acquired by multiple sensors.
  • That the processor matches the sensing data with the pre-registered recognition mode specifically includes:
  • the processor matches the sensing data with the single-sensor rule and/or the multi-sensor rule of the recognition mode;
  • that the sensing data matches one or more of the recognition modes specifically means:
  • the sensing data conforms to the single-sensor rule and/or the multi-sensor rule of the recognition mode.
  • The processor is further configured to turn on the sensor;
  • the method for the processor to turn on the sensor includes:
  • the processor opens a program;
  • the processor registers, according to the program, a recognition mode corresponding to the program;
  • the processor turns on the sensor according to the recognition mode.
  • That the processor performs the response operation includes:
  • the opened program performs the response operation of the recognition mode corresponding to the program, where the response operation is an operation that the opened program can perform.
  • The program includes a system-level program and/or an application-level program;
  • the system-level program is a program that is started when the terminal device is powered on, and the application-level program is a program that is opened by the terminal device according to a user input;
  • the processor is further configured to: close the application-level program, deregister the recognition mode corresponding to the closed application-level program, and
  • turn off the sensor corresponding to the deregistered recognition mode.
  • That the processor registers, according to the program, the recognition mode corresponding to the program specifically includes:
  • the processor registers, according to the program, the recognition mode corresponding to the program in an application framework.
  • In the embodiments of the present invention, a sensor is set in the terminal device, and a corresponding recognition mode is pre-registered in the terminal.
  • The terminal device can match the sensing data, which corresponds to the user gesture received by the sensor, with the pre-registered recognition mode.
  • If the sensing data matches one or more of the recognition modes, the terminal can recognize the current user's gesture and respond accordingly to that gesture.
  • FIG. 1 is a flow chart showing a method for recognizing a user gesture according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of air gestures in an embodiment of the present invention.
  • FIG. 3 is a flowchart of another method for recognizing a user gesture according to an embodiment of the present invention.
  • FIG. 4 is a flowchart of a method for a terminal device to turn on a sensor according to an embodiment of the present invention
  • FIG. 5 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
  • FIG. 6 is a schematic structural diagram of another terminal according to an embodiment of the present invention.
  • FIG. 7 is a schematic structural diagram of still another terminal according to an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of software and hardware of a terminal according to an embodiment of the present invention.
  • FIG. 9 is a flow chart of gesture recognition based on the terminal of FIG. 8.
  • FIG. 10 is a flowchart of a camera recognizing a user gesture according to an embodiment of the present invention.
  • FIG. 11 is a schematic diagram of a terminal for recognizing a user gesture based on a computer system according to an embodiment of the present invention.
  • As shown in FIG. 1, an embodiment of the present invention provides a method for a terminal device to recognize a user gesture, where the method includes the following steps:
  • Step 101: The sensor of the terminal device acquires sensing data.
  • The terminal device is provided with various sensors capable of sensing user gestures.
  • When the user uses the terminal, the user does not need to touch the screen with a finger; the user only needs to make a corresponding air gesture within a preset distance in front of the terminal, and the sensors disposed in the terminal capture the gesture and convert it into corresponding sensing data.
  • The air gestures may be as shown in FIG. 2, including waving, patting, lifting, and pressing.
  • The operating system and the applications of the terminal can assign different meanings to these gestures,
  • thereby expanding the set of human-computer interaction gestures on the basis of recognizing and using the existing touch-screen gestures.
  • The sensor may be built into the hardware of the terminal, or disposed outside the terminal and connected to the terminal through an interface provided on the terminal.
  • The sensor may include a distance sensor, a light sensor, a camera, a gyroscope, a three-dimensional accelerometer, and the like.
  • Step 102: The terminal device matches the sensing data with a pre-registered recognition mode.
  • The recognition modes are pre-registered in the terminal device; by pre-registering different recognition modes, the different user gestures that the terminal device can recognize are recorded.
  • When the terminal device receives the sensing data corresponding to an acquired user gesture, the sensing data needs to be matched with the pre-registered recognition modes to determine whether the currently received user gesture is one that the terminal can recognize.
  • Step 103: If the sensing data matches one or more of the recognition modes, the terminal device acquires a response operation of the matched recognition mode.
  • In this case the terminal can recognize the current user's gesture and obtain the corresponding response operation according to the recognized gesture. If the sensing data does not match any pre-registered recognition mode, the terminal cannot recognize the current user's gesture and does not need to respond to it.
  • Step 104: The terminal device performs the response operation.
  • The terminal performs the appropriate action according to the specific content of the response operation.
  • For example, the response operation may be to control the terminal to implement a corresponding adjustment function according to the user gesture, such as zooming the camera according to the user gesture.
  • In the embodiments of the present invention, a sensor is set in the terminal device, and a corresponding recognition mode is pre-registered in the terminal.
  • The terminal device can match the sensing data, which corresponds to the user gesture received by the sensor, with the pre-registered recognition mode.
  • If the sensing data matches one or more of the recognition modes, the terminal can recognize the current user's gesture and respond accordingly. It can be seen that, because the terminal receives the user's gesture through the sensor, the user does not need to touch the screen with a finger, so the user's gesture operations are not restricted by the need to touch the screen, and at the same time the accuracy of the terminal's recognition of user gestures is greatly improved.
  • The recognition mode may specifically include a recognition rule and the response operation.
  • The recognition rule specifically includes a single-sensor rule and/or a multi-sensor rule.
  • The single-sensor rule is a rule established for the sensing data acquired by one sensor;
  • the multi-sensor rule is a comprehensive rule established for the sensing data acquired by multiple sensors.
  • That the terminal device matches the sensing data with the pre-registered recognition mode specifically includes: the terminal device matches the sensing data with the single-sensor rule and/or the multi-sensor rule of the recognition mode;
  • that the sensing data matches one or more of the recognition modes specifically means: the sensing data conforms to the single-sensor rule and/or the multi-sensor rule of the recognition mode.
  • One recognition mode may correspond to turning on multiple sensors, as shown in Table 1, which indicates, for each recognition mode (MR001, ..., MR003, MR004), which sensors it requires.
  • Table 2 shows the rule that each sensor must conform to in recognition mode MR001.
  • Each recognition mode (e.g., MR001) may contain multiple sub-modes.
  • Taking recognition mode MR001 as an example: if the light sensor conforms to rule 1 and the distance sensor conforms to rule 1, matching continues by determining which rule the front camera conforms to; if the front camera also conforms to rule 1, sub-mode 1 is matched successfully; otherwise, the mode matching fails. Similarly, if the light sensor conforms to rule 2 and the distance sensor conforms to rule 2, matching continues with the front camera; if the front camera also conforms to rule 2, sub-mode 2 is matched successfully; otherwise, the mode matching fails.
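  • The sub-mode matching described for MR001 can be illustrated with the following Java sketch (not part of the patent; all names are hypothetical). A sub-mode records the rule number expected from each required sensor, and a sub-mode is matched only when every required sensor conforms to its expected rule.

```java
import java.util.List;
import java.util.Map;

// Illustrative sketch of sub-mode matching: a sub-mode is matched only when every
// required sensor conforms to that sub-mode's rule.
public class SubModeMatching {

    /** Which rule (1, 2, ...) a given sensor's current data conforms to, or -1 for none. */
    interface RuleClassifier {
        int ruleFor(String sensorName, double[] data);
    }

    /** One sub-mode: the rule number expected from each required sensor. */
    record SubMode(String name, Map<String, Integer> expectedRulePerSensor) {}

    /** Returns the name of the matched sub-mode, or null if mode matching fails. */
    static String matchMode(Map<String, double[]> sensorData, RuleClassifier classifier,
                            List<SubMode> subModes) {
        for (SubMode subMode : subModes) {
            boolean allConform = true;
            for (var entry : subMode.expectedRulePerSensor().entrySet()) {
                String sensor = entry.getKey();          // e.g. "light", "distance", "frontCamera"
                int expectedRule = entry.getValue();
                double[] data = sensorData.get(sensor);
                if (data == null || classifier.ruleFor(sensor, data) != expectedRule) {
                    allConform = false;                  // e.g. light sensor met rule 1 but camera did not
                    break;
                }
            }
            if (allConform) {
                return subMode.name();
            }
        }
        return null;                                     // mode matching fails
    }

    public static void main(String[] args) {
        // Sub-mode 1: every sensor must conform to rule 1; sub-mode 2: every sensor to rule 2.
        List<SubMode> mr001 = List.of(
                new SubMode("sub-mode 1", Map.of("light", 1, "distance", 1, "frontCamera", 1)),
                new SubMode("sub-mode 2", Map.of("light", 2, "distance", 2, "frontCamera", 2)));
        RuleClassifier toyClassifier = (sensor, data) -> data[0] > 0.5 ? 1 : 2;
        Map<String, double[]> reading = Map.of(
                "light", new double[]{0.9},
                "distance", new double[]{0.8},
                "frontCamera", new double[]{0.7});
        System.out.println(matchMode(reading, toyClassifier, mr001)); // prints "sub-mode 1"
    }
}
```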
  • Before the sensor of the terminal device acquires the sensing data, the method may further include:
  • Step 105: The terminal device turns on the sensor.
  • As shown in FIG. 4, the method for the terminal device to turn on the sensor may include: Step 401: The terminal device opens a program.
  • Step 402: The terminal device registers, according to the program, a recognition mode corresponding to the program.
  • Step 403: The terminal device turns on the sensor according to the recognition mode.
  • The response operation of the recognition mode corresponding to the program is performed by the opened program of the terminal, where the response operation is an operation that the opened program can perform.
  • The program specifically includes a system-level program and an application-level program, where the system-level program is a program that is started when the terminal device is powered on;
  • the application-level program is a program that the terminal device opens according to the user's input.
  • The terminal device may close the application-level program and deregister the recognition mode corresponding to the closed application-level program; finally, the terminal device turns off the sensor corresponding to the deregistered recognition mode.
  • The terminal device registers, according to the program, the recognition mode corresponding to the program in an application framework.
  • The application framework may be set inside the terminal device.
  • The application framework sits on top of the terminal operating system. Both system-level programs and application-level programs of the terminal device can interact with the application framework, through which the various functions of these programs are invoked.
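  • The registration flow just described can be sketched as follows in Java (hypothetical names, not the patent's implementation; it also simplifies by assuming each recognition mode is registered by a single program): opening a program registers its recognition mode and turns on the sensors that mode needs, and closing the program deregisters the mode and turns off sensors no remaining mode uses.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Minimal sketch of the described registration flow in an application framework.
public class FrameworkRegistrationSketch {

    /** Maps each registered recognition mode id to the sensors it requires (Table 1 style). */
    private final Map<String, Set<String>> modeToSensors = new HashMap<>();
    /** Maps each program to the recognition mode ids it has registered. */
    private final Map<String, Set<String>> programToModes = new HashMap<>();
    private final Set<String> openSensors = new HashSet<>();

    /** Steps 401-403: program opens, registers a mode, framework turns on the needed sensors. */
    public void register(String program, String modeId, Set<String> requiredSensors) {
        modeToSensors.put(modeId, requiredSensors);
        programToModes.computeIfAbsent(program, p -> new HashSet<>()).add(modeId);
        for (String sensor : requiredSensors) {
            if (openSensors.add(sensor)) {
                System.out.println("turning on sensor: " + sensor);   // via the sensor driver
            }
        }
    }

    /** Closing an application-level program: deregister its modes, turn off unused sensors. */
    public void close(String program) {
        Set<String> modes = programToModes.remove(program);
        if (modes == null) return;
        for (String modeId : modes) {
            modeToSensors.remove(modeId);   // simplification: mode owned by one program
        }
        // A sensor stays on only if some still-registered mode needs it.
        Set<String> stillNeeded = new HashSet<>();
        modeToSensors.values().forEach(stillNeeded::addAll);
        for (String sensor : new HashSet<>(openSensors)) {
            if (!stillNeeded.contains(sensor)) {
                openSensors.remove(sensor);
                System.out.println("turning off sensor: " + sensor);  // via the sensor driver
            }
        }
    }

    public static void main(String[] args) {
        FrameworkRegistrationSketch framework = new FrameworkRegistrationSketch();
        framework.register("camera", "MR001", Set.of("light", "distance", "frontCamera"));
        framework.close("camera");   // MR001 deregistered, its sensors turned off
    }
}
```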
  • An embodiment of the present invention further provides a terminal.
  • FIG. 5 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
  • The terminal specifically includes:
  • a sensing data acquiring module 501, configured to acquire sensing data by using a sensor;
  • a matching module 502, configured to match the sensing data with a pre-registered recognition mode;
  • a response operation acquiring module 503, configured to: if the sensing data matches one or more of the recognition modes, acquire a response operation of the matched recognition mode;
  • a response operation execution module 504, configured to perform the response operation.
  • The terminal is provided with various sensors capable of sensing a user's gesture.
  • When the user uses the terminal, the user does not need to touch the screen with a finger; the user only needs to make a corresponding air gesture within a preset distance in front of the terminal, and the sensors disposed in the terminal capture the gesture and convert it into corresponding sensing data.
  • The sensor may be built into the hardware of the terminal, or disposed outside the terminal and connected to the terminal through an interface provided on the terminal.
  • The sensor may include a distance sensor, a light sensor, a camera, a gyroscope, a three-dimensional accelerometer, and the like.
  • The recognition modes are pre-registered inside the terminal device.
  • After the sensing data acquiring module acquires the sensing data,
  • the matching module matches the sensing data with the pre-registered recognition modes to determine whether the currently received user gesture is one that the terminal can recognize. If the sensing data matches one or more of the recognition modes, the terminal can recognize the current user's gesture and, according to the recognized gesture, obtain the corresponding response operation through the response operation acquiring module. Finally, the response operation execution module performs the corresponding operation according to the specific content of the response operation.
  • In the embodiments of the present invention, a sensor is set in the terminal device, and a corresponding recognition mode is pre-registered in the terminal.
  • The terminal device can match the sensing data, which corresponds to the user gesture received by the sensor, with the pre-registered recognition mode.
  • If the sensing data matches one or more of the recognition modes, the terminal can recognize the current user's gesture and respond accordingly. It can be seen that, because the terminal receives the user's gesture through the sensor, the user does not need to touch the screen with a finger, so the user's gesture operations are not restricted by the need to touch the screen, and at the same time the accuracy of the terminal's recognition of user gestures is greatly improved.
  • The recognition mode may specifically include a recognition rule and the response operation.
  • The recognition rule specifically includes a single-sensor rule and/or a multi-sensor rule.
  • The single-sensor rule is a rule established for the sensing data acquired by one sensor;
  • the multi-sensor rule is a comprehensive rule established for the sensing data acquired by multiple sensors.
  • One recognition mode may correspond to turning on one or more sensors. Therefore, the matching module may specifically include:
  • a first matching module, configured to match the sensing data with the single-sensor rule of the recognition mode;
  • a second matching module, configured to match the sensing data with the multi-sensor rule of the recognition mode.
  • The terminal may further include:
  • a sensor opening module 505, configured to turn on the sensor.
  • Control over turning on the sensor is implemented by providing the sensor opening module: only after the sensor opening module turns on the sensor does the sensor in the terminal start to collect the user's gestures.
  • The sensor opening module may include:
  • a program opening unit, configured to open a program;
  • a recognition mode registration unit, configured to register, according to the program, a recognition mode corresponding to the program;
  • a sensor-on unit, configured to turn on the sensor according to the recognition mode.
  • The response operation of the recognition mode corresponding to the program is executed by the opened program of the terminal, where the response operation is an operation that the opened program can perform.
  • The program specifically includes a system-level program and an application-level program, where the system-level program is a program that is started when the terminal device is powered on;
  • the application-level program is a program that the terminal device opens according to the user's input.
  • The terminal device registers, according to the program, the recognition mode corresponding to the program in an internally provided application framework.
  • A program closing module 506 may further be disposed in the terminal, configured to close the application-level program;
  • a recognition mode deregistration module 507, configured to deregister the recognition mode corresponding to the closed application-level program;
  • a sensor closing module 508, configured to turn off the sensor corresponding to the deregistered recognition mode.
  • As shown in FIG. 8, the hardware sensors may include a distance sensor, a light sensor, a camera, a gyroscope, a three-dimensional accelerometer, and the like. The sensor driver is included in the bottom layer of the operating system (OS) and is used to interpret the data returned by the sensors and convert it into a data format that the upper-layer applications can recognize. The application framework is preset with gesture mode data; after collecting the sensor data produced by the sensor drivers, it determines, according to the system settings and the registrations of system programs or applications, whether the current sensor data conforms to a specific recognition mode.
  • When the sensor data conforms to a recognition mode, the application framework sends a gesture event to the applications or system programs registered for that mode. A system program is a system-level application resident in the terminal's memory.
  • A system program starts automatically when the terminal is powered on and registers the corresponding gesture events with the application framework at startup; an application is a non-resident, general-purpose application that is usually started manually by the user.
  • When it is started, the application likewise registers the corresponding gesture events with the application framework.
  • As shown in FIG. 9: Step 901: Each hardware sensor collects the sensor data corresponding to the user gesture and sends it to its sensor driver.
  • Step 902: The sensor driver filters out part of the useless sensing data according to the accuracy requirements set by the system.
  • Step 904: The application framework performs the corresponding mode matching according to the registration status of the system programs and applications.
  • Step 905: After a mode is matched, the application framework sends the recognized gesture event to the corresponding system program or application.
  • Step 906: The corresponding system program or application performs the response operation according to the received gesture event.
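  • A compressed Java sketch of this pipeline is given below (hypothetical names, not the patent's implementation): the driver drops readings that do not meet the system's accuracy requirement, and the framework matches what remains against the registered gesture modes, producing gesture events to be delivered to the registered programs.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of steps 901-906: sensors hand raw data to their drivers, the drivers drop
// readings below the system accuracy requirement, and the application framework matches
// the remaining data against the registered gesture modes.
public class GestureRecognitionPipeline {

    record RawReading(String sensor, double value, double accuracy) {}
    record GestureEvent(String modeId) {}

    /** Step 902: the sensor driver rejects readings below the system accuracy requirement. */
    static List<RawReading> filterByAccuracy(List<RawReading> raw, double minAccuracy) {
        List<RawReading> kept = new ArrayList<>();
        for (RawReading r : raw) {
            if (r.accuracy() >= minAccuracy) {
                kept.add(r);
            }
        }
        return kept;
    }

    /** Steps 904-905: match the filtered data against registered modes and emit gesture events. */
    interface ModeMatcher {
        // Returns the ids of the recognition modes that the filtered data conforms to.
        List<String> matchedModes(List<RawReading> filtered);
    }

    static List<GestureEvent> recognize(List<RawReading> raw, double minAccuracy, ModeMatcher matcher) {
        List<RawReading> filtered = filterByAccuracy(raw, minAccuracy);
        List<GestureEvent> events = new ArrayList<>();
        for (String modeId : matcher.matchedModes(filtered)) {
            events.add(new GestureEvent(modeId));   // step 905: sent to the registered programs
        }
        return events;
    }

    public static void main(String[] args) {
        List<RawReading> raw = List.of(
                new RawReading("light", 0.9, 0.95),
                new RawReading("distance", 0.2, 0.40));      // dropped: accuracy too low
        ModeMatcher toyMatcher = filtered ->
                filtered.isEmpty() ? List.<String>of() : List.of("MR001");
        System.out.println(recognize(raw, 0.5, toyMatcher));  // [GestureEvent[modeId=MR001]]
    }
}
```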
  • System programs and applications need to register with the application framework in order to receive the corresponding gesture events.
  • Application registration is used as an example below to illustrate the process of registering a program with the application framework.
  • When the application is first launched, it registers the relevant gesture modes with the application framework.
  • After receiving the gesture mode data, the application framework checks whether the required sensors are turned on; if a sensor is not turned on, it turns that sensor on. When it detects that the required sensors are already on and the gesture mode data has been registered successfully, the application framework sends a corresponding feedback message to the registering application.
  • Correspondingly, the application framework checks whether a turned-on sensor is no longer being used and, if so, issues an instruction to the sensor driver to turn off that sensor.
  • The programs whose registration has been completed can be stored as a program registration list.
  • An example of such a list is shown in Table 4 below.
  • In Table 4, MR001, ... represent the different gesture modes registered by each program.
  • When the application framework identifies a gesture mode, it looks up in the list all the programs registered for that mode and then sends the gesture event to each of those programs.
  • Different sensors may be required for different gesture modes.
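  • The program registration list of Table 4 can be sketched as a mapping from each gesture mode to the programs registered for it, so that a recognized gesture event is delivered to every such program. The Java below is an illustration with hypothetical names, not the patent's implementation.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch of a program registration list: gesture mode -> registered programs.
public class ProgramRegistrationList {

    /** A registered program just needs a way to receive gesture events. */
    interface RegisteredProgram {
        void onGestureEvent(String modeId);
    }

    // Gesture mode id (e.g. "MR001") -> programs registered for that mode.
    private final Map<String, List<RegisteredProgram>> registrations = new HashMap<>();

    public void register(String modeId, RegisteredProgram program) {
        registrations.computeIfAbsent(modeId, id -> new ArrayList<>()).add(program);
    }

    /** When the framework recognizes a gesture mode, it notifies every registered program. */
    public void dispatch(String modeId) {
        for (RegisteredProgram program : registrations.getOrDefault(modeId, List.of())) {
            program.onGestureEvent(modeId);
        }
    }

    public static void main(String[] args) {
        ProgramRegistrationList list = new ProgramRegistrationList();
        list.register("MR001", mode -> System.out.println("camera program handles " + mode));
        list.register("MR001", mode -> System.out.println("launcher (system program) handles " + mode));
        list.dispatch("MR001");   // both registered programs receive the gesture event
    }
}
```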
  • The following uses an example in which the user zooms the camera of the terminal by gesture, as shown in FIG. 10, to illustrate the entire process of recognizing the user's gesture.
  • Step 1001: The user opens the camera program.
  • Step 1002: The camera program registers the gesture zoom mode data with the application framework.
  • Step 1003: The application framework issues a start-up instruction to the related sensor drivers.
  • Step 1004: The related sensor drivers turn on the corresponding sensors.
  • Step 1005: The related sensor drivers send a feedback message indicating successful sensor start-up to the application framework.
  • Step 1006: The application framework sends a feedback message indicating successful registration to the camera program.
  • Step 1007: The user makes a zoom-out gesture in front of the camera.
  • Step 1008: The corresponding sensors send the user gesture signals to the sensor drivers.
  • Step 1009: The sensor drivers sort out the signals sent by the sensors and transmit them to the application framework.
  • Step 1010: The application framework matches the sensor data with the preset gesture mode data and successfully recognizes the user's gesture.
  • Step 1011: The application framework feeds the recognition result back to the camera program.
  • Step 1012: The camera program responds to the user's gesture by changing the display interface.
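  • The camera-zoom example can be condensed into the Java sketch below (hypothetical names and interfaces, not the patent's implementation): the camera program registers a zoom gesture mode when it is opened and, when the framework reports that the user's air gesture matched that mode, it updates its zoom level and display.

```java
// Sketch of steps 1001-1012: the camera program registers a zoom gesture mode on open
// and changes its zoom level when the framework reports a matching air gesture.
public class CameraZoomExample {

    /** Stand-in for the application framework's registration interface. */
    interface GestureFramework {
        void registerMode(String modeId, GestureListener listener);  // also turns on needed sensors
    }

    interface GestureListener {
        void onGesture(String modeId);
    }

    static class CameraProgram implements GestureListener {
        private double zoom = 2.0;

        /** Step 1002: register the gesture zoom mode when the camera program is opened. */
        void open(GestureFramework framework) {
            framework.registerMode("ZOOM_OUT", this);
        }

        /** Steps 1011-1012: respond to the recognized gesture by updating the display. */
        @Override
        public void onGesture(String modeId) {
            if ("ZOOM_OUT".equals(modeId)) {
                zoom = Math.max(1.0, zoom - 0.5);
                System.out.println("camera zoom is now " + zoom + "x (display updated)");
            }
        }
    }

    public static void main(String[] args) {
        // A toy framework that immediately "recognizes" the gesture for demonstration.
        CameraProgram camera = new CameraProgram();
        camera.open((modeId, listener) -> listener.onGesture(modeId));
        // prints: camera zoom is now 1.5x (display updated)
    }
}
```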
  • An embodiment of the present invention further provides a terminal that recognizes user gestures based on a computer system.
  • As shown in FIG. 11, the terminal may include a processor 1101, a memory 1102, a bus 1103, and a sensor 1104;
  • the processor 1101, the memory 1102, and the sensor 1104 are connected to each other through the bus 1103;
  • the sensor 1104 is configured to acquire sensing data;
  • the memory 1102 is configured to store computer-executable instructions; when the terminal runs, the processor 1101 executes the computer-executable instructions stored in the memory 1102, so that the terminal performs the following operations: matching the sensing data with a pre-registered recognition mode; if the sensing data matches one or more of the recognition modes, acquiring a response operation of the matched recognition mode; and performing the response operation.
  • The recognition mode includes a recognition rule and the response operation, where the recognition rule includes a single-sensor rule and/or a multi-sensor rule; the single-sensor rule is a rule established for the sensing data acquired by one sensor, and the multi-sensor rule is a comprehensive rule established for the sensing data acquired by multiple sensors.
  • That the processor matches the sensing data with the pre-registered recognition mode specifically includes: the processor matches the sensing data with the single-sensor rule and/or the multi-sensor rule of the recognition mode;
  • that the sensing data matches one or more of the recognition modes specifically means:
  • the sensing data conforms to the single-sensor rule and/or the multi-sensor rule of the recognition mode.
  • The processor is further configured to turn on the sensor.
  • The method for the processor to turn on the sensor includes:
  • the processor opens a program;
  • the processor registers, according to the program, a recognition mode corresponding to the program;
  • the processor turns on the sensor according to the recognition mode.
  • That the processor performs the response operation specifically includes: the opened program performs the response operation of the recognition mode corresponding to the program, where the response operation is an operation that the opened program can perform.
  • The program includes a system-level program and/or an application-level program.
  • The system-level program is a program that is started when the terminal device is powered on, and the application-level program is a program that is opened by the terminal device according to a user input.
  • The processor is further configured to: close the application-level program, deregister the recognition mode corresponding to the closed application-level program, and
  • turn off the sensor corresponding to the deregistered recognition mode.
  • That the processor registers, according to the program, the recognition mode corresponding to the program specifically includes: the processor registers, according to the program, the recognition mode corresponding to the program in an application framework.
  • The processor may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or the like.
  • A computer storage medium may store a program, and when executed the program may perform some or all of the steps of the various embodiments of the method provided by the embodiments of the present invention.
  • The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
  • It should be understood that the disclosed systems, devices, and methods may be implemented in other ways.
  • The device embodiments described above are merely illustrative.
  • For example, the division into units is only a division by logical function.
  • In actual implementation there may be other division manners; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • The mutual coupling, direct coupling, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
  • The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the embodiment.
  • Each functional unit in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium.
  • Based on such an understanding, the part of the technical solution of the present invention that is essential or that contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including
  • several instructions used to cause a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present invention.
  • The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Provided are a method and a device for a terminal device to recognize user gestures, which improve the accuracy with which the terminal recognizes the user's gestures. According to the method: a sensor of the terminal device acquires sensing data; the terminal device matches the sensing data with pre-registered recognition modes; if the sensing data matches one or more of the recognition modes, the terminal device acquires a response operation of the matched recognition mode(s); and the terminal device performs the response operation.
PCT/CN2013/088389 2013-12-03 2013-12-03 Method and device for a terminal device to recognize user gestures WO2015081485A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201380005552.5A CN104169858B (zh) 2013-12-03 2013-12-03 Method and device for a terminal device to recognize user gestures
PCT/CN2013/088389 WO2015081485A1 (fr) 2013-12-03 2013-12-03 Method and device for a terminal device to recognize user gestures

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2013/088389 WO2015081485A1 (fr) 2013-12-03 2013-12-03 Method and device for a terminal device to recognize user gestures

Publications (1)

Publication Number Publication Date
WO2015081485A1 (fr) 2015-06-11

Family

ID=51912340

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/088389 WO2015081485A1 (fr) 2013-12-03 2013-12-03 Method and device for a terminal device to recognize user gestures

Country Status (2)

Country Link
CN (1) CN104169858B (fr)
WO (1) WO2015081485A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104536565B (zh) * 2014-12-18 2019-01-11 深圳市酷商时代科技有限公司 应用程序控制方法和装置
CN106650346A (zh) * 2015-10-29 2017-05-10 阿里巴巴集团控股有限公司 一种密码输入的方法与设备
CN110609751B (zh) * 2018-06-14 2024-01-23 珠海市魅族科技有限公司 一种终端设备控制方法及装置、终端设备及计算机可读存储介质
CN110618874B (zh) * 2018-06-20 2023-11-07 珠海市魅族科技有限公司 一种终端设备控制方法及装置、终端设备及计算机可读存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102662462A (zh) * 2012-03-12 2012-09-12 中兴通讯股份有限公司 电子装置、手势识别方法及手势应用方法
CN103067630A (zh) * 2012-12-26 2013-04-24 刘义柏 一种通过手机的手势动作产生无线控制指令的方法
CN103226386A (zh) * 2013-03-13 2013-07-31 广东欧珀移动通信有限公司 一种基于移动终端的手势识别方法及系统
CN103279714A (zh) * 2013-06-19 2013-09-04 深圳市中兴移动通信有限公司 移动终端及其数据加密方法和解密方法
CN103294201A (zh) * 2013-06-27 2013-09-11 深圳市中兴移动通信有限公司 移动终端及其手势操控方法
CN103399633A (zh) * 2013-07-17 2013-11-20 北京小米科技有限责任公司 一种无线遥控方法及移动终端

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2502087A (en) * 2012-05-16 2013-11-20 St Microelectronics Res & Dev Gesture recognition


Also Published As

Publication number Publication date
CN104169858A (zh) 2014-11-26
CN104169858B (zh) 2017-04-26

Similar Documents

Publication Publication Date Title
US11513608B2 (en) Apparatus, method and recording medium for controlling user interface using input image
JP5790238B2 (ja) 情報処理装置、情報処理方法及びプログラム
KR102206054B1 (ko) 지문 처리 방법 및 그 전자 장치
CN105814522B (zh) 基于运动识别来显示虚拟输入设备的用户界面的设备和方法
RU2662690C2 (ru) Устройство и способ управления объектом пользовательского прибора
US9043502B1 (en) Portable computing device as control mechanism
KR20200101207A (ko) 복수의 카메라들을 이용하여 이미지의 배율을 변경하기 위한 전자 장치 및 방법
US10860857B2 (en) Method for generating video thumbnail on electronic device, and electronic device
WO2015161653A1 (fr) Procédé d'exploitation de terminal et dispositif terminal
KR20150007799A (ko) 영상 디스플레이를 제어하는 전자 장치 및 방법
KR102521192B1 (ko) 전자 장치 및 그의 동작 방법
EP2887648B1 (fr) Procédé de réalisation d'une prévisualisation et dispositif électronique pour la mise en oeuvre de ce procédé
JP2019128961A (ja) 指紋認識のための方法、電子装置及び格納媒体
US20190294652A1 (en) Electronic device and operation method thereof
EP4024839A1 (fr) Procédé d'opération et dispositif électronique
WO2020078234A1 (fr) Procédé de commande d'affichage et terminal
WO2017035818A1 (fr) Procédé de commande d'un appareil électronique, et dispositif et appareil électronique utilisant ce procédé
WO2015081485A1 (fr) Procédé et dispositif permettant à un dispositif terminal d'identifier les gestes d'un utilisateur
WO2019091124A1 (fr) Procédé d'affichage d'interface utilisateur de terminal et terminal
CN103207678A (zh) 一种电子设备及其解锁方法
US20160147294A1 (en) Apparatus and Method for Recognizing Motion in Spatial Interaction
WO2014190597A1 (fr) Procédé et système associé pour commander à distance un terminal de télévision numérique par l'intermédiaire d'un terminal mobile
EP3349098B1 (fr) Dispositif électronique et son procédé de fonctionnement
CN109542315B (zh) 移动终端的控制方法及系统
US20160086608A1 (en) Electronic device, method and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13898885

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13898885

Country of ref document: EP

Kind code of ref document: A1