CN105807900A - Non-contact type gesture control method and intelligent terminal - Google Patents


Info

Publication number: CN105807900A
Application number: CN201410844598.2A
Authority: CN (China)
Prior art keywords: gesture, intelligent terminal, track, characteristic value, signal
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 郑少华, 黎剑辉, 张圳, 朱伟, 朱一伟, 罗海彬, 湛浩
Current assignee: Fantem Technologies Shenzhen Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original assignee: Fantem Technologies Shenzhen Co Ltd
Application filed by Fantem Technologies Shenzhen Co Ltd

Landscapes

  • User Interface Of Digital Computer (AREA)
Abstract

The invention discloses a non-contact gesture control method and an intelligent terminal. In the embodiment of the invention, the intelligent terminal detects a gesture signal through a sensor arranged on the intelligent terminal and obtains a gesture track corresponding to the gesture signal, wherein the gesture signal is generated by a gesture action triggered by a user, and the gesture action does not contact the intelligent terminal. A track characteristic value corresponding to the gesture track is extracted; an operation event matched with the track characteristic value is searched for by using a preset mapping relationship between track characteristic values and operation events; and the operation event matched with the track characteristic value is executed according to the search result. Compared with the prior-art non-contact gesture control based on image recognition, the embodiment of the invention improves the recognition rate of non-contact gesture control and reduces misoperation of the intelligent terminal caused by non-contact gestures.

Description

Non-contact gesture control method and intelligent terminal
Technical Field
The invention relates to a gesture control technology, in particular to a non-contact gesture control method and an intelligent terminal.
Background
With the continuous improvement of living standards and the continuous progress of science and technology, consumers' requirements on intelligent terminals are higher and higher, and intelligent terminals are expected to become more and more intelligent. Controlling intelligent equipment through gestures is one of the mainstream control technologies in current human-machine interaction. Because non-contact gesture control can input corresponding information through specific hand actions alone, thereby achieving the purpose of controlling the intelligent terminal, non-contact gesture control of the intelligent terminal is more convenient and interesting than traditional input control such as a mouse and a keyboard.
The current non-contact gesture control technology mainly extracts images with depth information in space based on image recognition technology, analyzes gesture actions in the images, and then executes corresponding control commands, thereby achieving non-contact gesture control. This operation mode has a low recognition rate for user gestures and easily causes misoperation of the intelligent terminal.
Disclosure of Invention
In view of this, it is necessary to provide a non-contact gesture control method and an intelligent terminal, aiming at improving the recognition rate of non-contact gesture control and reducing the misoperation of the intelligent terminal performed based on the non-contact gesture.
The embodiment of the invention discloses a non-contact gesture control method, which comprises the following steps:
the method comprises the steps that the intelligent terminal detects gesture signals through a sensor arranged on the intelligent terminal and obtains gesture tracks corresponding to the gesture signals; the gesture signal is generated correspondingly by a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal;
extracting a track characteristic value corresponding to the gesture track, and searching for an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event;
and executing the operation event matched with the track characteristic value according to the search result.
Preferably, the method further comprises the step of:
and setting a mapping relation between the track characteristic value and the corresponding operation event.
Preferably, the step of setting the mapping relationship between the track characteristic value and the corresponding operation event includes:
acquiring a gesture signal generated by a user-defined gesture action, and acquiring a corresponding gesture track;
extracting a track characteristic value corresponding to the gesture track;
configuring an operation event to be executed correspondingly for the extracted track characteristic value;
and storing the mapping relation between the track characteristic value and the corresponding operation event.
Preferably, the step of detecting the gesture signal by the intelligent terminal through a sensor arranged on the intelligent terminal includes:
the intelligent terminal is provided with an infrared sensor, and acquires a corresponding gesture signal according to an infrared signal which reflects gesture track change and is detected by the infrared sensor;
or,
the intelligent terminal is provided with a light sensor, and the corresponding gesture signal is obtained according to a light intensity signal which reflects light intensity change and is detected by the light sensor;
or,
an ultrasonic sensor is arranged on the intelligent terminal, and the corresponding gesture signal is obtained according to an ultrasonic signal which reflects the change of the amplitude of the sound wave and is detected by the ultrasonic sensor;
or,
and a capacitive sensor is arranged on the intelligent terminal, and the corresponding gesture signal is acquired according to an electric signal which reflects the gesture track change and is detected by the capacitive sensor.
Preferably, the intelligent terminal receives an operation request triggered by a user, and updates the set mapping relationship between the track characteristic value and the corresponding operation event.
The embodiment of the invention also discloses an intelligent terminal, which comprises:
the detection module is used for detecting a gesture signal through a sensor arranged on the intelligent terminal and acquiring a gesture track corresponding to the gesture signal; the gesture signal is generated correspondingly by a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal;
the acquisition module is used for extracting a track characteristic value corresponding to the gesture track, and searching an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event;
and the execution module is used for executing the operation event matched with the track characteristic value according to the search result.
Preferably, the intelligent terminal further comprises:
and the setting module is used for setting the mapping relation between the track characteristic value and the corresponding operation event.
Preferably, the setting module is further configured to:
acquiring a gesture signal generated by a user-defined gesture action, and acquiring a corresponding gesture track;
extracting a track characteristic value corresponding to the gesture track;
configuring an operation event to be executed correspondingly for the extracted track characteristic value;
and storing the mapping relation between the track characteristic value and the corresponding operation event.
Preferably, the detection module is further configured to:
detecting an infrared signal reflecting the change of the gesture track by using an infrared sensor arranged on the intelligent terminal, and acquiring a corresponding gesture signal;
or,
detecting a light intensity signal reflecting light intensity change by using a light sensor arranged on the intelligent terminal to obtain a corresponding gesture signal;
or,
detecting an ultrasonic signal reflecting the amplitude change of the sound wave by using an ultrasonic sensor arranged on the intelligent terminal, and acquiring a corresponding gesture signal;
or,
and detecting an electric signal reflecting capacitance change by using a capacitance sensor arranged on the intelligent terminal to obtain the corresponding gesture signal.
Preferably, the setting module is further configured to:
and receiving an operation request triggered by a user, and updating the set mapping relation between the track characteristic value and the corresponding operation event.
The intelligent terminal detects a gesture signal through a sensor arranged on the intelligent terminal, and obtains a gesture track corresponding to the gesture signal; the gesture signal is generated correspondingly by a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal; extracting a track characteristic value corresponding to the gesture track, and searching for an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event; executing an operation event matched with the track characteristic value according to the search result; compared with the operation mode of carrying out non-contact gesture control based on image recognition in the prior art, the embodiment of the invention improves the recognition rate of the non-contact gesture control and reduces the misoperation of the intelligent terminal based on the non-contact gesture.
Drawings
FIG. 1 is a flowchart illustrating a non-contact gesture control method according to a first embodiment of the present invention;
FIG. 2 is a flowchart illustrating a non-contact gesture control method according to a second embodiment of the present invention;
FIG. 3 is a flowchart illustrating an embodiment of a mapping relationship between a terminal-set trajectory feature value and a corresponding operation event in the non-contact gesture control method according to the present invention;
FIG. 4 is a functional block diagram of a first embodiment of the intelligent terminal according to the present invention;
fig. 5 is a schematic diagram of functional modules of a second embodiment of the intelligent terminal of the present invention.
The implementation, functional features, and advantages of the embodiments of the present invention will be further described with reference to the accompanying drawings.
Detailed Description
The technical solution of the invention is further explained below with reference to the drawings and specific embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
At least one intelligent terminal is described in the invention, and the intelligent terminal can be any intelligent terminal such as a smartphone or a tablet computer. Those skilled in the art can understand that the embodiment of the present invention does not limit the specific form of the intelligent terminal, as long as the idea of the non-contact gesture control method of the present invention can be implemented. This description of the intelligent terminal applies throughout and is not repeated in the specific embodiments below.
Referring to fig. 1, fig. 1 is a schematic flow chart of a non-contact gesture control method according to a first embodiment of the present invention; as shown in FIG. 1, the non-contact gesture control method of the present invention comprises the following steps:
step S11, the intelligent terminal detects a gesture signal through a sensor arranged on the intelligent terminal, and obtains a gesture track corresponding to the gesture signal;
in the embodiment of the invention, the intelligent terminal is provided with a corresponding sensor; the specific number of the sensors is set according to needs, and the specific configuration number of the sensors is not limited in the embodiment of the invention. The sensor arranged on the intelligent terminal is used for detecting a gesture signal generated by the gesture change of the user; the gesture signal is generated correspondingly to a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal.
After detecting a gesture signal generated by a non-contact gesture action of a user, the intelligent terminal acquires a gesture track corresponding to the gesture signal.
In a preferred embodiment of the present invention, the detection modes of the intelligent terminal, through the sensor arranged on the intelligent terminal, for detecting the gesture signal corresponding to the gesture action triggered by the user include, but are not limited to, the following:
the intelligent terminal is provided with an infrared sensor, and the corresponding gesture signal is obtained according to the infrared signal which reflects the gesture track change and is detected by the infrared sensor. Or, a light sensor is arranged on the intelligent terminal, and the corresponding gesture signal is acquired according to a light intensity signal which reflects light intensity change and is detected by the light sensor. Or, an ultrasonic sensor is arranged on the intelligent terminal, and the corresponding gesture signal is obtained according to the ultrasonic signal which is detected by the ultrasonic sensor and reflects the amplitude change of the sound wave. Or, a capacitance sensor is arranged on the intelligent terminal, and the corresponding gesture signal is obtained according to the electric signal which reflects capacitance change and is detected by the capacitance sensor.
Those skilled in the art can understand that the embodiment of the present invention is not limited as to which sensor is specifically adopted to detect the gesture signal, which is triggered by the user and corresponds to the non-contact gesture motion of the intelligent terminal, by the sensor arranged on the intelligent terminal.
After the intelligent terminal detects a gesture signal corresponding to the gesture action of the user, a gesture track corresponding to the gesture signal is obtained according to the gesture signal. The gesture track acquired by the intelligent terminal can be understood as follows: and the gesture action triggered by the user is mapped to the operation track.
Step S12, extracting a track characteristic value corresponding to the gesture track, and searching for an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event;
after the intelligent terminal acquires a gesture track corresponding to a gesture signal, extracting a track characteristic value corresponding to the gesture track according to key points on the gesture track; and the track characteristic value corresponding to the gesture track can uniquely determine the specific form of the gesture track. One extraction mode for the intelligent terminal to extract the trajectory characteristic value corresponding to the gesture trajectory may be: obtaining a track key point which uniquely determines the gesture track, extracting the track key point and obtaining the coordinate of the key point; and taking the coordinates of the key points as input parameters, and acquiring a track characteristic value corresponding to the gesture track according to a preset algorithm.
After the track characteristic value corresponding to the track is obtained, the intelligent terminal searches the operation event matched with the track characteristic value according to the preset mapping relation between the track characteristic value and the operation event.
And step S13, executing the operation event matched with the track characteristic value according to the search result.
The intelligent terminal determines an operation event matched with the track characteristic value according to the search result, namely the operation event which needs to be executed by the intelligent terminal; and after the operation event is determined, the intelligent terminal executes the operation event matched with the track characteristic value.
For example, if the operation event matched with the trajectory feature value is to light a screen, the intelligent terminal executes an operation of lighting the screen.
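Steps S12 and S13 together amount to a table lookup followed by dispatch. A minimal sketch, assuming a dictionary-backed mapping table (the feature-value strings and event names are hypothetical):

```python
from typing import Optional

# Hypothetical preset mapping between track feature values and operation events.
feature_to_event = {
    "0,0-2,0-2,2": "light_screen",  # "L"-shaped swipe
    "0,0-4,0": "next_page",         # straight swipe to the right
}
event_handlers = {
    "light_screen": lambda: "screen lit",
    "next_page": lambda: "page turned",
}

def handle_gesture(feature_value: str) -> Optional[str]:
    """Search for the operation event matched with the feature value and execute it."""
    event = feature_to_event.get(feature_value)
    if event is None:
        return None                  # unrecognized track: ignore to avoid misoperation
    return event_handlers[event]()   # e.g. execute the screen-lighting operation
```

Returning `None` for an unmatched feature value is one way to realize the stated goal of reducing misoperation: gestures with no stored mapping simply trigger nothing.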
The intelligent terminal detects a gesture signal through a sensor arranged on the intelligent terminal, and obtains a gesture track corresponding to the gesture signal; the gesture signal is generated correspondingly by a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal; extracting a track characteristic value corresponding to the gesture track, and searching for an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event; executing an operation event matched with the track characteristic value according to the search result; compared with the operation mode of carrying out non-contact gesture control based on image recognition in the prior art, the embodiment of the invention improves the recognition rate of the non-contact gesture control and reduces the misoperation of the intelligent terminal based on the non-contact gesture.
Based on the description of the embodiment shown in fig. 1, please refer to fig. 2, and fig. 2 is a flowchart illustrating a non-contact gesture control method according to a second embodiment of the present invention; the difference between the embodiment of the present invention and the embodiment shown in fig. 1 is that before the intelligent terminal executes the function of non-contact gesture control operation, a trajectory characteristic value of a gesture trajectory corresponding to a gesture action triggered by a user needs to be preset, and a mapping relationship between the trajectory characteristic value and a corresponding operation event.
The setting operation only needs to be performed when the intelligent terminal is used for the first time and, other factors aside, only needs to be performed once; after the setting is completed, the stored mapping relationship between track characteristic values and operation events can be updated according to specific requirements.
As shown in fig. 2, before the intelligent terminal detects a gesture signal through a sensor arranged on the intelligent terminal in step S11 and obtains a gesture track corresponding to the gesture signal in the embodiment of fig. 1, the method for controlling a non-contact gesture further includes the steps of:
and step S10, setting the mapping relation between the track characteristic value and the corresponding operation event.
In the embodiment of the invention, the intelligent terminal configures and stores the mapping relation between the track characteristic value corresponding to the gesture track and the operation event based on the setting operation triggered by the user. In a preferred embodiment of the present invention, different track characteristic values and their corresponding operation events may be set in a table form, that is, the mapping relationship between the track characteristic values and the operation events is stored in a mapping relationship table form. When the updating condition of the mapping relation table is triggered, the intelligent terminal responds to a corresponding updating instruction to update the mapping relation table; for example, the intelligent terminal receives an operation request triggered by a user, and updates the set mapping relationship between the track characteristic value and the corresponding operation event.
The mapping relationship between the track characteristic value of the intelligent terminal updating gesture track and the corresponding operation event includes, but is not limited to, the following: adding a mapping relation between the track characteristic value of the gesture track and the corresponding operation event; or deleting the mapping relation between the track characteristic value of the stored gesture track and the corresponding operation event; or changing the mapping relation between the track characteristic value of the stored gesture track and the corresponding operation event, and the like. The embodiment of the present invention does not specifically limit the updating content of the intelligent terminal for updating the mapping relationship between the trajectory characteristic value and the corresponding operation event.
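The three update types above (add, delete, change) can be sketched against a dictionary-backed mapping table. The function name and action strings are illustrative assumptions:

```python
from typing import Dict

def update_mapping(table: Dict[str, str], action: str,
                   feature_value: str, event: str = "") -> None:
    """Apply a user-triggered update to the feature-value -> operation-event table."""
    if action == "add":
        table[feature_value] = event       # add a new gesture mapping
    elif action == "delete":
        table.pop(feature_value, None)     # delete a stored mapping
    elif action == "change":
        if feature_value in table:
            table[feature_value] = event   # change an existing mapping
    else:
        raise ValueError(f"unsupported update action: {action}")

mapping = {"0,0-2,0-2,2": "light_screen"}
update_mapping(mapping, "add", "0,0-4,0", "next_page")
update_mapping(mapping, "change", "0,0-2,0-2,2", "unlock")
update_mapping(mapping, "delete", "0,0-4,0")
```

In practice each update would be persisted so the table survives restarts, consistent with the stored mapping-relation table described above.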
According to the embodiment of the invention, the intelligent terminal improves the recognition efficiency of recognizing the operation event corresponding to the non-contact gesture by the intelligent terminal by setting the mapping relation between the track characteristic value of the gesture track and the corresponding operation event, so that the execution efficiency of executing the non-contact gesture control by the intelligent terminal is improved.
Based on the embodiments of fig. 1 and fig. 2, please refer to fig. 3. Fig. 3 is a schematic flow chart of an embodiment of setting, on the terminal, the mapping relationship between a trajectory feature value and the corresponding operation event in the non-contact gesture control method of the present invention. This embodiment describes only one specific manner of setting the mapping relationship between the trajectory characteristic value of the gesture trajectory and the corresponding operation event; for other aspects of the non-contact gesture control method of the present invention, please refer to the description of the corresponding embodiments, which is not repeated here.
As shown in fig. 3, in the non-contact gesture control method according to the embodiment of the present invention, the mapping relationship between the trajectory feature value and the corresponding operation event may be set in the following manner:
s21, acquiring gesture signals generated by user-defined gesture actions, and acquiring corresponding gesture tracks;
in the embodiment of the invention, the intelligent terminal acquires the gesture signal generated by the user-defined gesture action by using the set sensor, for example, acquires the corresponding gesture signal by using the light sensor, the ultrasonic sensor, the infrared sensor, the capacitive sensor or other sensors, and acquires the corresponding gesture track according to the acquired gesture signal.
Step S22, extracting a track characteristic value corresponding to the gesture track;
after the intelligent terminal acquires a gesture track corresponding to a gesture signal, extracting a track characteristic value corresponding to the gesture track according to key points on the gesture track; and the track characteristic value corresponding to the gesture track can uniquely determine the specific form of the gesture track. One extraction mode for the intelligent terminal to extract the trajectory characteristic value corresponding to the gesture trajectory may be: obtaining a track key point which uniquely determines the gesture track, extracting the track key point and obtaining the coordinate of the key point; and taking the coordinates of the key points as input parameters, and acquiring a track characteristic value corresponding to the gesture track according to a preset algorithm.
Step S23, configuring the operation event to be executed correspondingly for the extracted track characteristic value;
and step S24, storing the mapping relation between the track characteristic value and the corresponding operation event.
And configuring an operation event to be executed correspondingly for the extracted track characteristic value based on an operation instruction triggered by a user, generating a mapping relation between the track characteristic value and the corresponding operation event, and storing the mapping relation.
According to the embodiment of the invention, the intelligent terminal realizes the configuration of the mapping relation between the track characteristic value and the corresponding operation event.
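Steps S21 to S24 can be condensed into one registration routine. This is a minimal sketch under stated assumptions: the feature-extraction algorithm is passed in as a parameter, and the trivial coordinate-joining stand-in below is not the patent's actual algorithm.

```python
def register_custom_gesture(mapping: dict, gesture_track, event: str,
                            feature_fn) -> str:
    """S21-S24: extract the feature value of a user-defined gesture track and
    store its mapping to the operation event configured by the user."""
    feature = feature_fn(gesture_track)  # S22: extract the track feature value
    mapping[feature] = event             # S23/S24: configure and store the mapping
    return feature

# Trivial stand-in feature algorithm: join the track coordinates into a string.
toy_feature = lambda track: "-".join(f"{x},{y}" for x, y in track)

table = {}
register_custom_gesture(table, [(0, 0), (2, 0), (2, 2)], "light_screen", toy_feature)
```

After registration, the same `feature_fn` must be used at recognition time so that a repeated gesture reproduces the stored feature value and hits the table.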
Referring to fig. 4, fig. 4 is a schematic diagram of a functional module of an intelligent terminal according to a first embodiment of the present invention; as shown in fig. 4, in the embodiment of the present invention, the intelligent terminal includes: the device comprises a detection module 01, an acquisition module 02 and an execution module 03.
Wherein the detection module 01 is configured to: detecting a gesture signal through a sensor arranged on the intelligent terminal, and acquiring a gesture track corresponding to the gesture signal; the gesture signal is generated correspondingly by a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal;
in the embodiment of the invention, the intelligent terminal is provided with a corresponding sensor; the specific number of the sensors is set according to needs, and the specific configuration number of the sensors is not limited in the embodiment of the invention. The detection module 01 detects a gesture signal generated by the gesture change of the user based on the sensor arranged on the intelligent terminal; the gesture signal is generated correspondingly to a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal.
After the detection module 01 detects a gesture signal generated by a non-contact gesture action of a user, a gesture track corresponding to the gesture signal is obtained.
In a preferred embodiment of the present invention, the detection module 01 detects, through a sensor arranged on the intelligent terminal, gesture signals corresponding to gesture actions triggered by a user in a detection manner, which includes but is not limited to the following:
the intelligent terminal is provided with an infrared sensor, and the detection module 01 acquires the corresponding gesture signal according to the infrared signal which reflects the gesture track change and is detected by the infrared sensor. Or, a light sensor is arranged on the intelligent terminal, and the detection module 01 acquires the corresponding gesture signal according to a light intensity signal which is detected by the light sensor and reflects the change of the light intensity. Or, an ultrasonic sensor is arranged on the intelligent terminal, and the detection module 01 acquires the corresponding gesture signal according to the ultrasonic signal which is detected by the ultrasonic sensor and reflects the amplitude change of the sound wave. Or, a capacitance sensor is arranged on the intelligent terminal, and the detection module 01 acquires the corresponding gesture signal according to an electric signal which reflects capacitance change and is detected by the capacitance sensor.
Those skilled in the art can understand that the detection module 01 only needs to detect a gesture signal corresponding to a non-contact gesture motion of the intelligent terminal triggered by a user through a sensor arranged on the intelligent terminal, and as for what kind of sensor is specifically adopted on the intelligent terminal for detection, the embodiment of the present invention does not limit the detection.
After the detection module 01 detects a gesture signal corresponding to a gesture action of a user, a gesture track corresponding to the gesture signal is obtained according to the gesture signal. The gesture track acquired by the detection module 01 can also be understood as: and the gesture action triggered by the user is mapped to the operation track.
The obtaining module 02 is configured to: extracting a track characteristic value corresponding to the gesture track, and searching for an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event;
after the detection module 01 acquires a gesture track corresponding to a gesture signal, the acquisition module 02 extracts a track characteristic value corresponding to the gesture track according to key points on the gesture track; and the track characteristic value corresponding to the gesture track can uniquely determine the specific form of the gesture track. One of the extraction modes of the obtaining module 02 for extracting the trajectory feature value corresponding to the gesture trajectory may be: obtaining a track key point which uniquely determines the gesture track, extracting the track key point and obtaining the coordinate of the key point; and taking the coordinates of the key points as input parameters, and acquiring a track characteristic value corresponding to the gesture track according to a preset algorithm.
After obtaining the track characteristic value corresponding to the track, the obtaining module 02 searches for the operation event matched with the track characteristic value according to a preset mapping relationship between the track characteristic value and the operation event.
The execution module 03 is configured to: and executing the operation event matched with the track characteristic value according to the search result.
The executing module 03 determines, according to the search result of the obtaining module 02, an operation event that matches the trajectory eigenvalue, that is, an operation event that the executing module 03 needs to execute; after the operation event is determined, the execution module 03 executes the operation event matched with the trajectory feature value.
For example, if the operation event that the trajectory feature value matches is to light a screen, the execution module 03 executes an operation of lighting the screen.
The intelligent terminal detects a gesture signal through a sensor arranged on the intelligent terminal, and obtains a gesture track corresponding to the gesture signal; the gesture signal is generated correspondingly by a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal; extracting a track characteristic value corresponding to the gesture track, and searching for an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event; executing an operation event matched with the track characteristic value according to the search result; compared with the operation mode of carrying out non-contact gesture control based on image recognition in the prior art, the embodiment of the invention improves the recognition rate of the non-contact gesture control and reduces the misoperation of the intelligent terminal based on the non-contact gesture.
Referring to fig. 5, which builds on the embodiment shown in fig. 4, fig. 5 is a schematic diagram of the functional modules of an intelligent terminal according to a second embodiment of the present invention. This embodiment differs from the embodiment of fig. 4 in that, before the intelligent terminal performs non-contact gesture control, the track characteristic value of the gesture track corresponding to a user-triggered gesture action, together with the mapping relationship between that characteristic value and the corresponding operation event, needs to be preset.
This setting action only needs to be performed when the intelligent terminal is used for the first time and, other factors aside, only needs to be performed once; after setting is completed, the stored mapping relationship between track characteristic values and operation events can be updated according to specific requirements. The setting operation is performed by the setting module 04 on the intelligent terminal.
As shown in fig. 5, the intelligent terminal further includes:
The setting module 04 is configured to set the mapping relationship between track characteristic values and the corresponding operation events.
In the embodiment of the present invention, the setting module 04 configures and stores the mapping relationship between the track characteristic value corresponding to a gesture track and the corresponding operation event based on a setting operation triggered by the user. The setting module 04 may set this mapping relationship as follows:
acquire a gesture signal generated by a user-defined gesture action and obtain the corresponding gesture track; extract the track characteristic value corresponding to the gesture track; configure the operation event to be executed for the extracted track characteristic value; and store the mapping relationship between the track characteristic value and the corresponding operation event.
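The four setup steps above can be wired together roughly as follows. The helper names and the default extractor are assumptions standing in for the terminal's real components; any characteristic-value extractor can be passed in.

```python
def set_mapping(mapping_table, custom_gesture_key_points, event_name,
                extract=None):
    """Register a user-defined gesture.

    Extracts the track characteristic value from the custom gesture's key
    points and stores the characteristic-value -> operation-event pair in
    the mapping table. Returns the stored characteristic value.
    """
    if extract is None:
        # Stand-in extractor: use the raw key points themselves as the
        # characteristic value (a real terminal would apply its preset
        # algorithm here).
        extract = lambda pts: tuple(pts)
    characteristic = extract(custom_gesture_key_points)
    mapping_table[characteristic] = event_name
    return characteristic
```

Storing the table as a plain dictionary keeps the later lookup step a constant-time operation.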
In a preferred embodiment of the present invention, the setting module 04 may record the different track characteristic values and their corresponding operation events in table form, that is, the mapping relationship between track characteristic values and operation events is stored as a mapping relationship table.
When an update condition of the mapping relationship table is triggered, the setting module 04 responds to the corresponding update instruction and updates the table; for example, the setting module 04 receives an operation request triggered by the user and updates the stored mapping relationship between a track characteristic value and its corresponding operation event.
The ways in which the setting module 04 may update the mapping relationship between the track characteristic value of a gesture track and the corresponding operation event include, but are not limited to: adding a mapping relationship between the track characteristic value of a gesture track and a corresponding operation event; deleting a stored mapping relationship between the track characteristic value of a gesture track and its corresponding operation event; or changing a stored mapping relationship between the track characteristic value of a gesture track and its corresponding operation event. The embodiment of the present invention does not specifically limit the content of these updates.
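The three update operations listed above (add, delete, change) can be sketched against a mapping table stored as a dictionary. The request format (an action string plus arguments) is an illustrative assumption; the patent leaves the update interface open.

```python
def update_mapping(mapping_table, action, characteristic_value, event_name=None):
    """Apply a user-triggered update request to the mapping relationship table."""
    if action in ("add", "change"):
        # Adding a new pair and changing an existing one are the same
        # dictionary assignment; "change" simply overwrites the old event.
        mapping_table[characteristic_value] = event_name
    elif action == "delete":
        # Remove the stored pair; deleting an absent key is a no-op.
        mapping_table.pop(characteristic_value, None)
    else:
        raise ValueError(f"unknown update action: {action}")
    return mapping_table
```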
In the embodiment of the invention, by presetting the mapping relationship between the track characteristic values of gesture tracks and the corresponding operation events, the intelligent terminal improves the efficiency with which it recognizes the operation event corresponding to a non-contact gesture, and thereby improves the efficiency with which it executes non-contact gesture control.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only for the preferred embodiment of the present invention and is not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes that can be directly or indirectly applied to other related technical fields using the contents of the present specification and the accompanying drawings are included in the scope of the present invention.

Claims (10)

1. A non-contact gesture control method is characterized by comprising the following steps:
an intelligent terminal detects a gesture signal through a sensor arranged on the intelligent terminal, and obtains a gesture track corresponding to the gesture signal; wherein the gesture signal is generated by a gesture action triggered by a user, and the gesture action triggered by the user does not contact the intelligent terminal;
extracting a track characteristic value corresponding to the gesture track, and searching for an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event;
and executing the operation event matched with the track characteristic value according to the search result.
2. The method of claim 1, wherein the method further comprises the step of:
and setting a mapping relation between the track characteristic value and the corresponding operation event.
3. The method of claim 2, wherein the step of setting a mapping of the trajectory feature values to corresponding operational events comprises:
acquiring a gesture signal generated by a user-defined gesture action, and acquiring a corresponding gesture track;
extracting a track characteristic value corresponding to the gesture track;
configuring an operation event to be executed correspondingly for the extracted track characteristic value;
and storing the mapping relation between the track characteristic value and the corresponding operation event.
4. The method according to any one of claims 1 to 3, wherein the step in which the intelligent terminal detects the gesture signal through a sensor arranged on the intelligent terminal comprises:
the intelligent terminal is provided with an infrared sensor, and acquires a corresponding gesture signal according to an infrared signal which reflects gesture track change and is detected by the infrared sensor;
or,
the intelligent terminal is provided with a light sensor, and the corresponding gesture signal is obtained according to a light intensity signal which reflects light intensity change and is detected by the light sensor;
or,
an ultrasonic sensor is arranged on the intelligent terminal, and the corresponding gesture signal is obtained according to an ultrasonic signal which reflects the change of the amplitude of the sound wave and is detected by the ultrasonic sensor;
or,
and a capacitive sensor is arranged on the intelligent terminal, and the corresponding gesture signal is acquired according to an electric signal which reflects the gesture track change and is detected by the capacitive sensor.
5. The method according to any one of claims 1 to 3, further comprising: the intelligent terminal receiving an operation request triggered by a user, and updating the set mapping relation between the track characteristic value and the corresponding operation event.
6. An intelligent terminal, characterized in that the intelligent terminal comprises:
the detection module is used for detecting a gesture signal through a sensor arranged on the intelligent terminal and acquiring a gesture track corresponding to the gesture signal; the gesture signal is generated correspondingly by a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal;
the acquisition module is used for extracting a track characteristic value corresponding to the gesture track, and searching an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event;
and the execution module is used for executing the operation event matched with the track characteristic value according to the search result.
7. The intelligent terminal of claim 6, wherein the intelligent terminal further comprises:
and the setting module is used for setting the mapping relation between the track characteristic value and the corresponding operation event.
8. The intelligent terminal of claim 7, wherein the setting module is further configured to:
acquiring a gesture signal generated by a user-defined gesture action, and acquiring a corresponding gesture track;
extracting a track characteristic value corresponding to the gesture track;
configuring an operation event to be executed correspondingly for the extracted track characteristic value;
and storing the mapping relation between the track characteristic value and the corresponding operation event.
9. The intelligent terminal of any of claims 6 to 8, wherein the detection module is further configured to:
detecting an infrared signal reflecting the change of the gesture track by using an infrared sensor arranged on the intelligent terminal, and acquiring a corresponding gesture signal;
or,
detecting a light intensity signal reflecting light intensity change by using a light sensor arranged on the intelligent terminal to obtain a corresponding gesture signal;
or,
detecting an ultrasonic signal reflecting the amplitude change of the sound wave by using an ultrasonic sensor arranged on the intelligent terminal, and acquiring a corresponding gesture signal;
or,
and detecting an electric signal reflecting capacitance change by using a capacitance sensor arranged on the intelligent terminal to obtain the corresponding gesture signal.
10. The intelligent terminal of any one of claims 6 to 8, wherein the setting module is further configured to:
and receiving an operation request triggered by a user, and updating the set mapping relation between the track characteristic value and the corresponding operation event.
CN201410844598.2A 2014-12-30 2014-12-30 Non-contact type gesture control method and intelligent terminal Pending CN105807900A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410844598.2A CN105807900A (en) 2014-12-30 2014-12-30 Non-contact type gesture control method and intelligent terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410844598.2A CN105807900A (en) 2014-12-30 2014-12-30 Non-contact type gesture control method and intelligent terminal

Publications (1)

Publication Number Publication Date
CN105807900A true CN105807900A (en) 2016-07-27

Family

ID=56420111

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410844598.2A Pending CN105807900A (en) 2014-12-30 2014-12-30 Non-contact type gesture control method and intelligent terminal

Country Status (1)

Country Link
CN (1) CN105807900A (en)


Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106249899A (en) * 2016-08-15 2016-12-21 珠海格力电器股份有限公司 Gesture recognition system, electric appliance and control method thereof
CN106502385A (en) * 2016-09-22 2017-03-15 南京酷派软件技术有限公司 The control method of mobile terminal, control device and mobile terminal
WO2018053956A1 (en) * 2016-09-26 2018-03-29 华为技术有限公司 Interaction method, and wearable device
CN108139798A (en) * 2016-09-26 2018-06-08 华为技术有限公司 Exchange method and wearable device
CN106384033A (en) * 2016-09-29 2017-02-08 努比亚技术有限公司 Screen off method and apparatus of terminal screen
CN106384032A (en) * 2016-09-29 2017-02-08 努比亚技术有限公司 Unlocking method and apparatus for terminal
CN106445150A (en) * 2016-09-29 2017-02-22 努比亚技术有限公司 Method and device for operating terminal application
CN106445148A (en) * 2016-09-29 2017-02-22 努比亚技术有限公司 Method and device for triggering terminal application
CN106445149A (en) * 2016-09-29 2017-02-22 努比亚技术有限公司 Method and device for controlling terminal application
CN106484290A (en) * 2016-09-29 2017-03-08 努比亚技术有限公司 A kind of multistage pressing system based on proximity transducer and mobile terminal
CN106484294A (en) * 2016-09-29 2017-03-08 努比亚技术有限公司 A kind of two-stage pressing system based on proximity transducer and mobile terminal
CN106484258A (en) * 2016-09-29 2017-03-08 努比亚技术有限公司 A kind of method and device of terminal screen bright screen
CN106527685A (en) * 2016-09-30 2017-03-22 努比亚技术有限公司 Control method and device for terminal application
CN106527833A (en) * 2016-10-31 2017-03-22 努比亚技术有限公司 Method and device for sliding mobile application
CN107371051A (en) * 2017-08-22 2017-11-21 宋彦震 Concert scene gesticulation guidance system
CN107371051B (en) * 2017-08-22 2019-10-01 杭州勒格网络科技有限公司 Concert scene gesticulation guidance system
CN108924417A (en) * 2018-07-02 2018-11-30 Oppo(重庆)智能科技有限公司 Filming control method and Related product
CN109388240A (en) * 2018-09-25 2019-02-26 北京金茂绿建科技有限公司 A kind of non-contact gesture control method and device
CN110850982B (en) * 2019-11-11 2024-03-29 南方科技大学 AR-based man-machine interaction learning method, system, equipment and storage medium
CN111787223A (en) * 2020-06-30 2020-10-16 维沃移动通信有限公司 Video shooting method and device and electronic equipment
CN111787223B (en) * 2020-06-30 2021-07-16 维沃移动通信有限公司 Video shooting method and device and electronic equipment
CN113419621A (en) * 2020-11-12 2021-09-21 浙江大学 Abnormal behavior detection method, page and gesture processing method, device and electronic equipment
CN116185205A (en) * 2023-04-25 2023-05-30 中国电子科技集团公司第十五研究所 Non-contact gesture interaction method and device

Similar Documents

Publication Publication Date Title
CN105807900A (en) Non-contact type gesture control method and intelligent terminal
US9207771B2 (en) Gesture based user interface
CN103353935B (en) A kind of 3D dynamic gesture identification method for intelligent domestic system
CN103529762B (en) A kind of intelligent home furnishing control method based on sensor technology and system
CN105518571B (en) For being based on the exectorial method and apparatus of context aware
US10126825B2 (en) Method for recognizing handwriting on a physical surface
US10126858B2 (en) Touch display device and touch method thereof
CN105353880B (en) A kind of application program launching control system and method based on finger portion information
CN105045495B (en) A kind of control method and mobile terminal of mobile terminal
CN104346127A (en) Realization method, realization device and terminal for voice input
CN104978133A (en) Screen capturing method and screen capturing device for intelligent terminal
CA2879057A1 (en) Method and apparatus for controlling application by handwriting image recognition
CN103699260B (en) A kind of method starting termination function module and terminal unit
EP2612222A1 (en) Use camera to augment input for portable electronic device
CN108762489B (en) Control method based on data glove, system and storage medium
CN104850433A (en) Mobile terminal application starting method and mobile terminal
WO2017032006A1 (en) Method and apparatus for displaying information
CN109976656B (en) Page turning control method for browsing webpage and user terminal
CN109246525B (en) Gesture-based earphone control method and device and headset
CN105205377A (en) Device, method and mobile terminal for preventing mistakenly starting of camera
CN108984096A (en) touch operation method, device, storage medium and electronic equipment
US10146372B2 (en) Method for controlling blank screen gesture processing and terminal
CN103324410A (en) Method and apparatus for detecting touch
CN106873889A (en) Fingerprint operation method and device and electronic equipment
CN104915003A (en) Somatosensory control parameter adjusting method, somatosensory interaction system and electronic equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20160727