CN106775377B - Gesture recognition device, equipment and control method of gesture recognition device - Google Patents

Gesture recognition device, equipment and control method of gesture recognition device

Info

Publication number
CN106775377B
CN106775377B (application CN201611048931.4A)
Authority
CN
China
Prior art keywords
gesture
graph
preset
module
current
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611048931.4A
Other languages
Chinese (zh)
Other versions
CN106775377A (en)
Inventor
刘华一君
吴珂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201611048931.4A
Publication of CN106775377A
Application granted
Publication of CN106775377B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 - User authentication
    • G06F21/36 - User authentication by graphic or iconic representation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a gesture recognition apparatus, a device, and a control method of the gesture recognition apparatus. The gesture recognition apparatus includes: a gesture recognition module configured to recognize a current gesture graph of a user; a gesture storage module configured to store at least one preset gesture graph, where a preset gesture graph comprises a gesture graph together with the control information of the device application that gesture graph controls; a gesture comparison module configured to compare the current gesture graph recognized by the gesture recognition module with the preset gesture graphs stored in the gesture storage module one by one; and a gesture response module that runs and/or switches the device application corresponding to the current gesture graph according to the comparison result of the gesture comparison module. The apparatus enriches the ways the device can be controlled and improves the user experience.

Description

Gesture recognition device, equipment and control method of gesture recognition device
Technical Field
The present disclosure belongs to the field of computers, and relates to a gesture recognition apparatus, a device, and a control method of the gesture recognition apparatus.
Background
In the related art, computer technology has matured with the development of science and technology. Technologies such as fingerprint recognition and face recognition are increasingly applied in notebook computers, and as touch-screen notebooks become more widespread, fingerprint-unlock and face-recognition-unlock techniques have grown ever more popular.
However, when a user runs and/or switches an operating system or application program on a computer, the user still has to select and operate the corresponding function manually, which is a rather traditional selection mode. The ways in which computer applications can be controlled could be richer.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a gesture recognition apparatus, a device, and a control method of the gesture recognition apparatus, which enrich the ways the device can be controlled and improve the user experience.
According to a first aspect of the embodiments of the present disclosure, there is provided a gesture recognition apparatus, including:
a gesture recognition module configured to recognize a current gesture graph of a user;
a gesture storage module configured to store at least one preset gesture graph, where a preset gesture graph comprises a gesture graph together with the control information of the device application controlled by that gesture graph;
a gesture comparison module configured to compare the current gesture graph recognized by the gesture recognition module with the preset gesture graphs stored by the gesture storage module one by one;
and a gesture response module that responds with the device application corresponding to the current gesture graph according to the comparison result of the gesture comparison module.
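The four modules listed above can be sketched in miniature as follows. This is a hypothetical illustration only: the class names, the string-based gesture graphs, and the control-information strings are ours, not the patent's.

```python
# Hypothetical sketch of the four-module apparatus: a preset gesture
# graph pairs a gesture graph with the control information of the
# device application it controls.

class GestureStorageModule:
    """Stores preset gesture graphs: gesture graph -> control information."""
    def __init__(self):
        self.presets = {}  # e.g. {"circle": "launch_browser"}

    def add(self, graph, control_info):
        self.presets[graph] = control_info

class GestureComparisonModule:
    """Compares a current gesture graph with the stored presets one by one."""
    def __init__(self, storage):
        self.storage = storage

    def compare(self, current_graph):
        for graph, control_info in self.storage.presets.items():
            if graph == current_graph:   # match found
                return control_info
        return None                      # no match: device does not respond

class GestureResponseModule:
    """Responds with the device application named by the control information."""
    def respond(self, control_info):
        if control_info is None:
            return "no response"
        return f"running {control_info}"
```

In practice the gesture graph would be a recognized motion trajectory rather than a string, and the control information would name a real operating system or application program.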
In an embodiment, the gesture comparison module compares the current gesture graph with the preset gesture graphs one by one, wherein:
when the current gesture graph matches one of the preset gesture graphs, the matched preset gesture graph is sent to the gesture response module;
and when the current gesture graph matches none of the preset gesture graphs, the device does not respond.
In one embodiment, the gesture recognition apparatus further comprises:
and the gesture input module is configured to input a gesture graph and correlate at least one input gesture graph with the corresponding equipment application to be controlled to form the preset gesture graph.
In an embodiment, the gesture entry module comprises:
an input sub-module configured to receive a gesture graph input through the gesture recognition module;
a pairing sub-module configured to associate the at least one gesture graph received by the input sub-module with the corresponding device application to generate a preset gesture graph, and to send the generated preset gesture graph to the gesture storage module.
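The two entry sub-modules can be sketched like this. Again an illustrative assumption: the classes, the shared preset dictionary, and the prompt text are ours, not the patent's.

```python
# Hypothetical sketch of the gesture entry module: an input sub-module
# that rejects duplicate gesture graphs with a prompt, and a pairing
# sub-module that associates a graph with a device application.

class InputSubModule:
    """Receives a gesture graph; rejects duplicates with a prompt."""
    def __init__(self, presets):
        self.presets = presets            # shared preset database

    def receive(self, graph):
        if graph in self.presets:         # graph already entered
            return None, "gesture graph already exists"
        return graph, None

class PairingSubModule:
    """Associates a gesture graph with a device application and stores it."""
    def __init__(self, presets):
        self.presets = presets

    def pair(self, graph, device_app):
        self.presets[graph] = device_app  # store the generated preset
```

Re-pairing a graph with a different application is then just another call to `pair`, matching the adjustment behavior described later in the description.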
In an embodiment, the preset gesture graph controls the running and/or switching of a corresponding user operating system or application program.
According to a second aspect of the embodiments of the present disclosure, there is provided a control method of a gesture recognition apparatus, including:
recognizing a current gesture graph of a user;
comparing the recognized current gesture graph with the stored preset gesture graphs one by one, where a preset gesture graph comprises a gesture graph together with the control information of the device application controlled by that gesture graph;
and responding to the corresponding equipment application of the current gesture graph according to the comparison result.
In one embodiment, the current gesture graph is compared with the preset gesture graphs one by one, wherein:
when the current gesture graph matches one of the preset gesture graphs, the matched preset gesture graph is sent;
when the current gesture graph matches none of the preset gesture graphs, the device does not respond.
In one embodiment, the method further comprises:
entering gesture graphs, and associating at least one entered gesture graph with the corresponding device application to be controlled to form the preset gesture graph.
In one embodiment, entering the gesture graph includes:
receiving an entered gesture graph;
and associating at least one received gesture graph with the corresponding device application to generate the preset gesture graph, then outputting and storing the generated preset gesture graph.
In an embodiment, the preset gesture graph controls the running and/or switching of a corresponding user operating system or application program.
In one embodiment, any user operating system or application program may be associated with at least one preset gesture graph.
According to a third aspect of embodiments of the present disclosure, there is provided a gesture recognition apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to: recognize a current gesture graph of a user; compare the recognized current gesture graph with the stored preset gesture graphs one by one, where a preset gesture graph comprises a gesture graph together with the control information of the device application controlled by that gesture graph; and respond with the device application corresponding to the current gesture graph according to the comparison result.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects: the current gesture graph entered on the device controls the running of, and/or the switching among, different applications, giving a good user experience, a fast device response, and high running efficiency. The preset gesture graphs are stored on the device to form a database, and gesture graphs can be configured to match each user's habits, so operation is flexible and use is convenient.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
FIG. 1 is a block diagram illustrating a control process of a gesture recognition apparatus according to an exemplary embodiment of the present disclosure;
FIG. 2 is a block diagram illustrating an entry and storage process of a gesture recognition apparatus according to an exemplary embodiment of the present disclosure;
FIG. 3 is a control flow diagram of a gesture recognition apparatus shown in an exemplary embodiment of the present disclosure;
FIG. 4 is an entry and storage flow diagram of a gesture recognition apparatus shown in an exemplary embodiment of the present disclosure;
FIG. 5 is a flow chart illustrating control and determination of a gesture recognition apparatus according to an exemplary embodiment of the present disclosure;
FIG. 6 is a block diagram illustrating a device in an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining", depending on the context.
As shown in fig. 1, according to an exemplary embodiment, a gesture recognition apparatus is shown, in which a gesture recognition module 10, a gesture storage module 20, a gesture comparison module 30 and a gesture response module 40 are disposed.
The gesture recognition module 10 is configured to recognize the current gesture graph of a user: the motion trajectory of a gesture is input on the gesture recognition module 10, and the gesture recognition module 10 forms the current gesture graph from that motion trajectory. The gesture storage module 20 is configured to store at least one preset gesture graph, where a preset gesture graph comprises a gesture graph together with the control information of the device application it controls. The at least one preset gesture graph stored in the gesture storage module 20 is associated with the applications or operating systems that the device needs to control and make respond through gesture graphs, forming a database of preset gesture graphs.
The gesture comparison module 30 is configured to compare the current gesture graph recognized by the gesture recognition module 10 with the preset gesture graphs stored in the gesture storage module 20 one by one. The gesture recognition module 10 recognizes a valid current gesture graph and transmits it to the gesture comparison module 30, which reads the preset gesture graphs in the gesture storage module 20 and compares the current gesture graph against them one by one. When the current gesture graph matches a preset gesture graph, the gesture comparison module 30 sends the comparison result to the gesture response module 40; the gesture response module 40 obtains the control information corresponding to the current gesture graph and makes the corresponding device application respond according to that control information.
For example, the response of the device application may include running and/or switching the corresponding user operating system or application program. When the control information corresponding to the current gesture graph specifies switching between two applications on the device, the gesture response module 40 reads the control information and controls the two device applications to switch. When the control information specifies entering or running an operating system on the device, the gesture response module 40 reads the control information and controls the device to enter or run that operating system. That is, the gesture response module 40 controls the device's response function according to the control information carried in the preset gesture graph.
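The run-versus-switch behavior of the response module described above might look like the following sketch. The `action`/`target` fields and the application names are hypothetical, chosen only to illustrate the two kinds of control information.

```python
# Illustrative dispatch of control information in the response module:
# a preset gesture graph may carry a "run" instruction for one target,
# or a "switch" instruction between two device applications.

def respond(control_info, device_state):
    action = control_info["action"]
    if action == "run":
        # enter/run the named application or operating system
        device_state["foreground"] = control_info["target"]
    elif action == "switch":
        a, b = control_info["targets"]
        # bring the other of the two applications to the foreground
        device_state["foreground"] = b if device_state["foreground"] == a else a
    return device_state
```

Repeated "switch" gestures thus toggle between the two paired applications, while "run" always lands on its single target.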
With the gesture recognition module 10 provided on the device, a user can set up and pair gesture graphs with device functions according to his or her own habits or preferences; entering a gesture graph on the gesture recognition module 10 then controls the corresponding device function, which is convenient to operate and gives a good user experience.
An entry and storage process of a gesture recognition apparatus is shown according to an exemplary embodiment. As shown in fig. 2, the gesture recognition apparatus further includes a gesture entry module 50, configured to enter gesture graphs and associate at least one entered gesture graph with the corresponding device application to be controlled, forming a preset gesture graph. The gesture recognition module 10 is connected to the gesture entry module 50; in the initial state, preset gesture graphs are entered into the device, with the gesture recognition module 10 sending each recognized gesture graph to the gesture entry module 50. The gesture entry module 50 associates the received gesture graph with the device application the user wants it to control, so that the application can later be controlled through that gesture graph: the device responds to the gesture graph by running and/or switching the corresponding application or operating system. The gesture graph entry and pairing process can be carried out in a program provided on the device and displayed visually on the device's display screen, which is intuitive and convenient to operate.
In the above embodiment, the gesture entry module 50 includes an input sub-module 51 and a pairing sub-module 52. The input sub-module 51 is configured to receive the gesture graph input through the gesture recognition module 10: the gesture recognition module 10 recognizes the gesture graph input by the user and delivers it to the input sub-module 51. The input sub-module 51 may determine whether the entered gesture graph already exists and, if so, issue a prompt that the entered gesture graph already exists. If the entered gesture graph does not yet exist, the input sub-module 51 accepts it and passes it to the pairing sub-module 52. The pairing sub-module 52 is configured to associate at least one gesture graph with the corresponding device application to generate a preset gesture graph, and to send the generated preset gesture graph to the gesture storage module 20. The association between gesture graphs and device applications in the pairing sub-module 52 can be adjusted in the device's software: when a paired application and gesture graph need to be changed, the gesture graph can simply be re-associated with a device application on the pairing sub-module 52, making adjustment convenient.
Providing the input sub-module 51 and the pairing sub-module 52 to associate and pair device applications with gesture graphs makes both the operation and the pairing process convenient, giving a good user experience.
In an exemplary embodiment, the gesture comparison module 30 compares the current gesture graph with the preset gesture graphs one by one. The gesture recognition module 10 transmits the recognized gesture graph to the gesture comparison module 30, which connects to and retrieves the database of preset gesture graphs in the gesture storage module 20 and compares the current gesture graph with the preset gesture graphs. When the current gesture graph matches one of the preset gesture graphs, the matched preset gesture graph is sent to the gesture response module 40; when the current gesture graph matches none of them, the device does not respond.
When the current gesture graph matches one of the preset gesture graphs, the control information of the device application carried in that preset gesture graph can be retrieved from the gesture storage module 20; the gesture response module 40 receives the control information and runs the corresponding device application. Controlling the running, response, or switching of device applications through gesture graphs in this way is convenient to operate and gives a good user experience.
The preset gesture graph controls the running and/or switching of a corresponding user operating system or application program. A preset gesture graph may carry the control information of its device application, which may include entering the device's operating system, entering a particular application program, switching between two user operating systems, switching between two application programs, switching between a user operating system and an application program, and so on.
Users can set up gesture graphs according to their personal habits and preferences to control the corresponding device applications; operation is simple, the applications respond quickly, and the user experience is good.
Any user operating system or application program may be associated with at least one preset gesture graph. When configuring the gesture graphs associated with a user operating system or application program, several preset gesture graphs can be associated with, and responded to by, the same operating system or application program according to the needs or habits of different users. When the device is shared among users, each user can enter the operating system or application program through his or her own setting, which is convenient and gives a good user experience.
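The many-gestures-to-one-application association can be pictured as a simple mapping. All names here are illustrative, not from the patent:

```python
# Illustrative only: several preset gesture graphs may map to the same
# operating system or application, so different users can each register
# their own gesture for it.

presets = {
    "circle": "user_os_alice",   # Alice's gesture for her OS
    "square": "mail_app",
    "zigzag": "user_os_alice",   # a second gesture for the same OS
}

def target_of(gesture):
    # None means no preset matches: the device does not respond
    return presets.get(gesture)
```

Because the mapping is keyed by gesture graph, each gesture resolves unambiguously even when several gestures share one target.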
In an exemplary embodiment, the gesture recognition module 10 is a display with touch-screen functionality: on such a device, gesture graphs can be recognized directly on the display, which is intuitive and convenient.
In another exemplary embodiment, the gesture recognition module 10 is a gesture recognition panel disposed on the keyboard or other hardware of the device. A gesture recognition panel lowers the performance requirements placed on the device while making it easier for the user to run and switch operating systems or application programs. The panel lets the user enter a gesture graph directly and enter the corresponding operating system or application program; it also offers good concealment, helping protect the user's privacy.
As shown in fig. 3, a control method of a gesture recognition apparatus according to an exemplary embodiment is shown, the control method including the steps of:
in step 101, a current gesture pattern of a user is identified. A current gesture pattern input by a user on the device is identified, the gesture pattern representing an operating system or application that the user needs to run and/or switch.
In step 102, at least one preset gesture pattern is stored, where the preset gesture pattern is a gesture pattern and control information of a device application controlled by the gesture pattern. The device is stored with a database formed by at least one gesture graph, and the gesture graph in the database carries control information for controlling corresponding device application. The preset gesture information is stored in the equipment and used for calling and controlling the running of corresponding equipment application.
In step 103, the recognized current gesture pattern is compared with the stored preset gesture pattern one by one. And comparing the current gesture graph input by the user on the equipment with a preset gesture graph stored in the equipment, and calling control information of corresponding equipment application carried in the preset gesture graph according to a comparison result.
And step 104, responding to the corresponding equipment application of the current gesture graph according to the comparison result. And acquiring control information of corresponding equipment application according to the comparison result, and controlling a corresponding function of the equipment, such as an operating system or an application program which is operated and/or switched, according to the acquired control information.
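Steps 101-104 can be strung together in one sketch, with hypothetical names; the preset database stands in for the one formed in step 102:

```python
# Hypothetical end-to-end sketch of the control method, steps 101-104.

def control_method(current_gesture, preset_db):
    # Step 101: the current gesture graph has already been recognized
    #           and is passed in as `current_gesture`.
    # Step 102: `preset_db` maps stored gesture graphs to the control
    #           information of their device applications.
    # Step 103: compare the current graph with the presets one by one.
    for graph, control_info in preset_db.items():
        if graph == current_gesture:
            # Step 104: respond with the corresponding device application.
            return f"respond: {control_info}"
    # No match: the device does not respond.
    return "no response"
```

A non-matching gesture simply falls through the loop, matching the "device does not respond" branch of fig. 5.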
With preset gesture graphs stored on the device, the user enters a current gesture graph, the device matches it against the preset gesture graphs, and the corresponding device function is controlled. This makes running and/or switching functions on the device convenient and gives a good user experience; the user can also set up gesture graphs according to his or her own habits and needs.
As shown in fig. 4, in the above embodiment, the control method further includes: entering gesture graphs, and associating at least one entered gesture graph with the corresponding device application to be controlled, forming preset gesture graphs. This step collects, associates, and pre-stores the device applications paired with gesture graphs, storing the preset gesture graphs on the device to form a database. The user's control mappings for device applications are all associated, pre-entered, and modified in this step, which is convenient to operate.
The gesture graph entry step can be further subdivided as follows:
In step 201, an entered gesture graph is received. When the device is in the entry working state, the gesture graph recognized on the device is received and checked for whether it already exists. If it already exists, a prompt that the entered gesture graph already exists is issued; if not, the gesture graph is passed to the next step.
In steps 202 and 203, at least one gesture graph is associated with the corresponding device application to generate a preset gesture graph, and the generated preset gesture graph is output and stored. At least one gesture graph is associated with the device application the user requires, and the paired preset gesture graph is stored on the device; entering that gesture graph on the device then controls the running and/or switching of the corresponding function. This is convenient for the user, each preset gesture graph can be associated with its device application, the control methods are rich, and the user experience is good.
As shown in fig. 5, step 103 can be further subdivided:
In step 301, the current gesture graph is compared with the preset gesture graphs one by one, wherein:
In steps 302 and 303, when the current gesture graph matches one of the preset gesture graphs, the matched preset gesture graph is sent. The control information of the corresponding device application carried in the matched preset gesture graph is transmitted to the gesture response module 40, which controls the running of the corresponding function on the device.
In steps 302 and 304, when the current gesture graph matches none of the preset gesture graphs, the device does not respond. A current gesture graph that matches no gesture graph pre-stored in the device's database is an invalid gesture graph, and the device stays in its current state.
Comparing the current gesture graph with the preset gesture graphs lets the current gesture graph control the running of the corresponding function on the device; the comparison is efficient and the application runs stably.
The preset gesture graph controls the running and/or switching of a corresponding user operating system or application program. Controlling applications on the device through gesture graphs is convenient and saves the user the steps of finding and selecting the corresponding function or application, giving a good user experience.
Any user operating system or application program may be associated with at least one preset gesture graph. Different preset gesture graphs can be set for the same operating system or application program according to the preferences and habits of different users, or the same preset gesture graph can be shared; the setup is flexible and convenient.
A gesture recognition apparatus according to an exemplary embodiment includes: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to: recognize a current gesture graph of a user; compare the recognized current gesture graph with the stored preset gesture graphs one by one, where a preset gesture graph comprises a gesture graph together with the control information of the device application controlled by that gesture graph; and respond with the device application corresponding to the current gesture graph according to the comparison result.
FIG. 6 is a block diagram illustrating a gesture recognition device 60 according to an example embodiment. For example, the device 60 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 6, device 60 may include one or more of the following components: processing component 602, memory 604, power component 606, multimedia component 608, audio component 610, input/output (I/O) interface 612, sensor component 614, and communication component 616.
The processing component 602 generally controls the overall operation of the device 60, such as operations associated with display, telephone calls, data communications, camera operation, and recording. The processing component 602 may include one or more processors 620 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 602 can include one or more modules that facilitate interaction between the processing component 602 and other components. For example, the processing component 602 can include a multimedia module to facilitate interaction between the multimedia component 608 and the processing component 602.
Memory 604 is configured to store various types of data to support operation at device 60. Examples of such data include instructions for any application or method operating on device 60, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 604 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power component 606 provides power to the various components of device 60. Power components 606 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 60.
The multimedia component 608 includes a screen that provides an output interface between the device 60 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 608 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 60 is in an operating mode, such as a shooting mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 610 is configured to output and/or input audio signals. For example, audio component 610 includes a Microphone (MIC) configured to receive external audio signals when device 60 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in the memory 604 or transmitted via the communication component 616. In some embodiments, audio component 610 further includes a speaker for outputting audio signals.
The I/O interface 612 provides an interface between the processing component 602 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 614 includes one or more sensors for providing various aspects of status assessment for the device 60. For example, the sensor component 614 may detect the open/closed status of the device 60 and the relative positioning of components, such as the display and keypad of the device 60. The sensor component 614 may also detect a change in the position of the device 60 or of a component of the device 60, the presence or absence of user contact with the device 60, the orientation or acceleration/deceleration of the device 60, and a change in the temperature of the device 60. The sensor component 614 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor component 614 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 614 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
Communication component 616 is configured to facilitate communications between device 60 and other devices in a wired or wireless manner. The device 60 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 616 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 616 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 60 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as the memory 604 including instructions executable by the processor 620 of the device 60 to perform the above-described method. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
The above description is only exemplary of the present disclosure and should not be taken as limiting the disclosure, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (12)

1. A gesture recognition apparatus applied to a computer, the computer comprising a display, a keyboard, and hardware, the gesture recognition apparatus comprising:
a gesture recognition module configured to recognize a current gesture graph of a user, wherein the gesture recognition module is a display with a touch-screen function, or a gesture recognition panel arranged on the keyboard or other hardware of the device;
a gesture storage module configured to store at least one preset gesture graph, wherein each preset gesture graph comprises a gesture graph and control information, corresponding to that gesture graph, for controlling a corresponding device application; the at least one preset gesture graph is matched to a user of the computer; the computer has two or more users; and the preset gesture graph controls running and/or switching of the corresponding user operating system;
a gesture comparison module configured to compare the current gesture graph recognized by the gesture recognition module with the preset gesture graphs stored by the gesture storage module one by one; and
a gesture response module configured to run and/or switch the device application corresponding to the current gesture graph according to the comparison result of the gesture comparison module.
2. The gesture recognition apparatus of claim 1, wherein the gesture comparison module compares the current gesture graph with the preset gesture graphs one by one, wherein:
when the current gesture graph matches one of the preset gesture graphs, the matched preset gesture graph is sent to the gesture response module; and
when the current gesture graph matches none of the preset gesture graphs, the device does not respond.
3. The gesture recognition apparatus according to claim 1, further comprising:
a gesture entry module configured to enter a gesture graph, and to correlate at least one entered gesture graph with the corresponding device application to be controlled, so as to form the preset gesture graph.
4. The gesture recognition apparatus of claim 3, wherein the gesture entry module comprises:
an input submodule configured to receive a gesture graph input through the gesture recognition module; and
a pairing submodule configured to correlate at least one gesture graph received by the input submodule with a corresponding device application to generate a preset gesture graph, and to send the generated preset gesture graph to the gesture storage module.
5. The gesture recognition apparatus according to claim 1, wherein the preset gesture graph controls the running and/or switching of a corresponding application program.
6. A control method of a gesture recognition apparatus, applied to a computer comprising a display, a keyboard, and hardware, the control method comprising:
recognizing a current gesture graph of a user, wherein the recognition is carried out through a display with a touch-screen function, or through a gesture recognition panel arranged on the keyboard or other hardware of the device;
comparing the recognized current gesture graph with stored preset gesture graphs one by one, wherein each preset gesture graph comprises a gesture graph and control information, corresponding to that gesture graph, for controlling a corresponding device application; the at least one preset gesture graph is matched to a user of the computer; the computer has two or more users; and the preset gesture graph controls running and/or switching of the corresponding user operating system; and
running and/or switching the device application corresponding to the current gesture graph according to the comparison result.
7. The control method of the gesture recognition apparatus according to claim 6, wherein the current gesture graph is compared with the preset gesture graphs one by one, wherein:
when the current gesture graph matches one of the preset gesture graphs, the matched preset gesture graph is sent; and
when the current gesture graph matches none of the preset gesture graphs, the device does not respond.
8. The control method of the gesture recognition apparatus according to claim 6, further comprising:
entering a gesture graph, and correlating at least one entered gesture graph with the corresponding device application to be controlled, so as to form the preset gesture graph.
9. The control method of the gesture recognition apparatus according to claim 8, wherein entering the gesture graph comprises:
receiving an input gesture graph; and
correlating at least one received gesture graph with a corresponding device application to generate the preset gesture graph, and outputting and storing the generated preset gesture graph.
10. The control method of the gesture recognition apparatus according to claim 6, wherein the preset gesture graph controls the running and/or switching of a corresponding application program.
11. The control method of the gesture recognition apparatus according to claim 6, wherein the at least one preset gesture graph is associated with any one of a user operating system and an application program.
12. A gesture recognition device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to: recognize a current gesture graph of a user through a display with a touch-screen function, or through a gesture recognition panel arranged on the keyboard or other hardware of the device; compare the recognized current gesture graph with stored preset gesture graphs one by one, wherein each preset gesture graph comprises a gesture graph and control information, corresponding to that gesture graph, for controlling a corresponding device application; the at least one preset gesture graph is matched to a user of the computer; the computer has two or more users; and the preset gesture graph controls running and/or switching of the corresponding user operating system; and run and/or switch the device application corresponding to the current gesture graph according to the comparison result.
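Claims 3-4 and 8-9 describe a gesture entry module: receiving an input gesture graph, correlating it with the device application to be controlled, and sending the resulting preset to the gesture storage module. A minimal sketch of that entry-and-pairing flow follows; it is an illustration only, and the class, function, and field names are all hypothetical, not taken from the patent.

```python
# Hypothetical sketch of the gesture entry module of claims 3-4 / 8-9:
# receive a gesture graph, pair it with a device application to form a
# preset gesture graph, and hand the preset to the storage module.

class GestureStore:
    """Stands in for the gesture storage module of claim 1."""
    def __init__(self):
        self.presets = {}

    def save(self, gesture, control):
        # Store the preset keyed by the (hashable) gesture graph.
        self.presets[tuple(gesture)] = control

def enter_gesture(store, gesture, app, action="launch"):
    """Pairing submodule: correlate the received gesture graph with a
    device application to generate a preset, then store it."""
    control = {"action": action, "target": app}
    store.save(gesture, control)
    return control

store = GestureStore()
enter_gesture(store, ["up", "left"], "browser")
```

After entry, the stored preset can be matched against later input by the comparison step, which is how a per-user gesture could also select an operating system (claim 11) rather than an application.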
CN201611048931.4A 2016-11-23 2016-11-23 Gesture recognition device, equipment and control method of gesture recognition device Active CN106775377B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611048931.4A CN106775377B (en) 2016-11-23 2016-11-23 Gesture recognition device, equipment and control method of gesture recognition device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611048931.4A CN106775377B (en) 2016-11-23 2016-11-23 Gesture recognition device, equipment and control method of gesture recognition device

Publications (2)

Publication Number Publication Date
CN106775377A CN106775377A (en) 2017-05-31
CN106775377B true CN106775377B (en) 2020-12-18

Family

ID=58910489

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611048931.4A Active CN106775377B (en) 2016-11-23 2016-11-23 Gesture recognition device, equipment and control method of gesture recognition device

Country Status (1)

Country Link
CN (1) CN106775377B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107181974A (en) * 2017-07-18 2017-09-19 无锡路通视信网络股份有限公司 The gesture control device and control method of a kind of set top box
CN108717325B (en) * 2018-04-18 2020-08-25 Oppo广东移动通信有限公司 Operation gesture setting method and device and mobile terminal
CN108509049B (en) * 2018-04-19 2020-04-10 北京华捷艾米科技有限公司 Method and system for inputting gesture function
CN110245586A (en) * 2019-05-28 2019-09-17 贵州卓霖科技有限公司 A kind of data statistical approach based on gesture identification, system, medium and equipment
CN116247766A (en) * 2021-03-15 2023-06-09 荣耀终端有限公司 Wireless charging system, chip and wireless charging circuit

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101482796A (en) * 2009-02-11 2009-07-15 中兴通讯股份有限公司 System and method for starting mobile terminal application function through touch screen
CN104182240A (en) * 2013-05-22 2014-12-03 中国移动通信集团公司 Method and device for starting application programs and mobile terminal

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102778954B (en) * 2012-06-29 2016-05-11 Tcl集团股份有限公司 A kind of gesture operation management method and device
CN102929555B (en) * 2012-10-29 2015-07-08 东莞宇龙通信科技有限公司 Terminal and application program uninstalling method
CN103092517A (en) * 2013-01-22 2013-05-08 广东欧珀移动通信有限公司 Method and device for achieving opening preset program through gesture operation quickly
CN103543946A (en) * 2013-10-28 2014-01-29 Tcl通讯(宁波)有限公司 Gesture recognition based mobile terminal awakening and unlocking method and gesture recognition based mobile terminal awakening and unlocking system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101482796A (en) * 2009-02-11 2009-07-15 中兴通讯股份有限公司 System and method for starting mobile terminal application function through touch screen
CN104182240A (en) * 2013-05-22 2014-12-03 中国移动通信集团公司 Method and device for starting application programs and mobile terminal

Also Published As

Publication number Publication date
CN106775377A (en) 2017-05-31

Similar Documents

Publication Publication Date Title
CN107919123B (en) Multi-voice assistant control method, device and computer readable storage medium
CN106572299B (en) Camera opening method and device
US10175671B2 (en) Method and apparatus for controlling intelligent device
CN106775377B (en) Gesture recognition device, equipment and control method of gesture recognition device
CN107102772B (en) Touch control method and device
CN105183364A (en) Application switching method, application switching device and application switching equipment
CN106537319A (en) Screen-splitting display method and device
CN106357934B (en) Screen locking control method and device
CN110262692B (en) Touch screen scanning method, device and medium
CN105677023A (en) Information presenting method and device
CN112181265B (en) Touch signal processing method, device and medium
CN107132983B (en) Split-screen window operation method and device
CN108874450B (en) Method and device for waking up voice assistant
CN108766427B (en) Voice control method and device
CN106302342B (en) User account switching method and device
CN106919302B (en) Operation control method and device of mobile terminal
CN107948876B (en) Method, device and medium for controlling sound box equipment
CN111225111A (en) Function control method, function control device, and storage medium
CN107422911B (en) Pressure value detection method and device and computer readable storage medium
CN111610921A (en) Gesture recognition method and device
CN114296628A (en) Display page control method and device, keyboard, electronic equipment and storage medium
CN103973883A (en) Method and device for controlling voice input device
CN114296587A (en) Cursor control method and device, electronic equipment and storage medium
CN110769114B (en) Call interface display method and device and mobile terminal
CN106527954B (en) Equipment control method and device and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant