CN110362206B - Gesture detection method, gesture detection device, terminal and computer readable storage medium - Google Patents
Gesture detection method, gesture detection device, terminal and computer readable storage medium
- Publication number
- CN110362206B (application CN201910644106.8A)
- Authority
- CN
- China
- Prior art keywords
- gesture
- terminal
- light intensity
- intensity value
- change
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application belongs to the technical field of user interaction, and particularly relates to a gesture detection method, a gesture detection device, a terminal and a computer readable storage medium. The gesture detection method comprises the following steps: acquiring a light intensity value acquired by a light sensor and a distance value acquired by a distance sensor; determining the motion direction of the current gesture according to the time sequence in which the light intensity value and the distance value change; and generating a gesture trigger instruction according to the motion direction, wherein the gesture trigger instruction is used for instructing the terminal to execute a corresponding function. This avoids the problem that the terminal cannot be triggered to realize the corresponding function when the user is wearing gloves or the user's hands are stained with dirt or water and the touch screen cannot be clicked or slid, meets the operation requirements of users in different scenarios, and improves the triggering efficiency of the terminal.
Description
Technical Field
The application belongs to the technical field of user interaction, and particularly relates to a gesture detection method, a gesture detection device, a terminal and a computer readable storage medium.
Background
With the rapid development of mobile communication technology, intelligent terminals offer more and more functions: they can be used not only to send and receive short messages and make calls, but also for web browsing, online shopping, navigation, gaming, and the like. Accordingly, terminal functions that were originally triggered by pressing physical keys are now triggered by clicking or sliding on a touch screen.
However, when the user is wearing gloves or the user's hands are stained with dirt or water, the touch screen cannot be clicked or slid, the terminal cannot be triggered to realize the corresponding function, and the triggering efficiency is low.
Disclosure of Invention
The embodiments of the application provide a gesture detection method, a gesture detection device, a terminal and a computer readable storage medium, which can solve the technical problem that the terminal cannot be triggered to realize corresponding functions in specific scenarios.
The first aspect of the embodiment of the application provides a gesture detection method, which is applied to a terminal, wherein the terminal is provided with a light sensor and a distance sensor; the gesture detection method comprises the following steps:
acquiring a light intensity value acquired by a light sensor and a distance value acquired by a distance sensor;
determining the motion direction of the current gesture according to the time sequence of the change of the light intensity value and the distance value;
and generating a gesture triggering instruction according to the movement direction, wherein the gesture triggering instruction is used for indicating the terminal to execute a corresponding function.
The second aspect of the embodiment of the application provides a gesture detection device, which is configured on a terminal, wherein the terminal is provided with a light sensor and a distance sensor; the gesture detection device includes:
the acquisition unit is used for acquiring a light intensity value acquired by the light sensor and a distance value acquired by the distance sensor;
the determining unit is used for determining the motion direction of the current gesture according to the time sequence of the change of the light intensity value and the distance value;
and the generating unit is used for generating a gesture triggering instruction according to the movement direction, wherein the gesture triggering instruction is used for indicating the terminal to execute a corresponding function.
A third aspect of the embodiments of the present application provides a terminal comprising a light sensor, a distance sensor, a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the above method when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs the steps of the above method.
According to the embodiments of the application, the motion direction of the current gesture is determined according to the time sequence in which the light intensity value and the distance value change, and a gesture trigger instruction is generated according to the motion direction, so that the terminal is triggered to execute the corresponding function. This avoids the problem that the terminal cannot be triggered to realize the corresponding function when the user is wearing gloves or the user's hands are stained with dirt or water and the touch screen cannot be clicked or slid, meets the operation requirements of users in different scenarios, and improves the triggering efficiency of the terminal.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting the scope; other related drawings may be obtained from these drawings by a person skilled in the art without inventive effort.
FIG. 1 is a schematic diagram of an implementation flow of a gesture detection method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of the locations of the light sensor and the distance sensor on the terminal according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a sensing area of a photosensor and a sensing area of a distance sensor according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating a specific implementation of step 102 of a gesture detection method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a specific implementation flow of step 103 of a gesture detection method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a specific implementation flow of step 401 of a gesture detection method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a gesture motion direction provided by an embodiment of the present application;
FIG. 8 is a flowchart illustrating a specific implementation of step 402 of a gesture detection method according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a gesture detection apparatus according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
At present, in the process of triggering the terminal to realize various functions by clicking or sliding the touch screen, if the user is wearing gloves or the user's hands are stained with dirt or water, the terminal cannot be triggered to realize the corresponding functions. The existing triggering modes of the terminal therefore cannot meet the operation requirements of users in different scenarios, and the triggering efficiency is low.
Based on this, embodiments of the present application provide a gesture detection method, a gesture detection device, a terminal, and a computer readable storage medium, which can improve the triggering efficiency of the terminal.
Fig. 1 shows a schematic implementation flow chart of a gesture detection method according to an embodiment of the present application. The method is applied to a terminal, may be executed by a gesture detection device configured on the terminal, and is suitable for scenarios in which the triggering efficiency of the terminal needs to be improved. The terminal may be an intelligent terminal such as a mobile phone, a tablet computer, or a wearable device, and the gesture detection method may include steps 101 to 103.
Step 101, acquiring a light intensity value acquired by a light sensor and a distance value acquired by a distance sensor.
In the embodiment of the application, the terminal is provided with a light sensor and a distance sensor. The light sensor is used for collecting the light intensity value of the current environment of the terminal in real time, and the distance sensor is used for detecting the distance value of an object near the terminal from the terminal in real time.
The gesture of the user is detected by means of the light sensor and the distance sensor arranged on the terminal.
Specifically, the light sensor and the distance sensor are located at different positions on the terminal and have different sensing areas; the sensing areas may or may not overlap with each other.
For example, fig. 2 schematically shows the positions of the light sensor 21 and the distance sensor 22 on the terminal, and fig. 3 schematically shows the sensing area 31 (solid-line region) of the light sensor and the sensing area 32 (dashed-line region) of the distance sensor.
It should be noted that this is merely an illustration and does not limit the protection scope of the present application; it can be understood that the light sensor and the distance sensor may also be disposed at other positions on the terminal.
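The description does not prescribe any particular data representation for the two sensor streams. Purely as an illustration, the following Python sketch (not part of the patented method) shows one way timestamped readings from step 101 might be represented and the moment of first change detected; the `Reading` structure, the threshold, and the baseline comparison are all assumptions of the example.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Reading:
    t_ms: int      # timestamp in milliseconds
    value: float   # light intensity (e.g. lux) or distance (e.g. cm)

def first_change_time(samples: List[Reading], threshold: float) -> Optional[int]:
    """Return the moment the signal first deviates from its initial
    (baseline) value by more than `threshold`, or None if it never does."""
    baseline = samples[0].value
    for s in samples[1:]:
        if abs(s.value - baseline) > threshold:
            return s.t_ms
    return None
```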
Step 102, determining the motion direction of the current gesture according to the time sequence of the change of the light intensity value and the distance value.
In the embodiment of the application, after the light intensity value collected by the light sensor and the distance value collected by the distance sensor are obtained, the motion direction of the current gesture of the user can be determined.
For example, as shown in fig. 4, determining the motion direction of the current gesture according to the time sequence in which the light intensity value and the distance value change may include: steps 401 to 402.
In step 401, if the light intensity value and the distance value change at the same moment, it is determined that the motion direction of the current gesture is the vertical direction.
In step 402, if the light intensity value and the distance value do not change at the same moment, it is determined that the motion direction of the current gesture is the horizontal direction.
Specifically, since the light sensor and the distance sensor are located at different positions on the terminal and have different sensing areas, when the user's hand moves horizontally with respect to the terminal, the light intensity value collected by the light sensor and the distance value collected by the distance sensor change at different moments; when the user's hand moves vertically with respect to the terminal, the two values change at the same moment.
For example, as shown in fig. 3, when the user's hand moves horizontally from the solid-line area toward the dashed-line area, the light intensity value collected by the light sensor changes first from its initial value (for example, 0), while the distance value collected by the distance sensor has not yet changed. Conversely, when the user's hand moves horizontally from the dashed-line area toward the solid-line area, the distance value changes first from its initial value (for example, 0), while the light intensity value has not yet changed. When the user's hand moves vertically with respect to the terminal, the light intensity value and the distance value change at the same moment. Therefore, the motion direction of the current gesture can be determined according to the time sequence in which the light intensity value and the distance value change, and a corresponding gesture trigger instruction can then be generated.
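Continuing the illustration, the coarse decision of steps 401 and 402 can be sketched as follows; the 50 ms tolerance for treating two change moments as "the same time" is an invented parameter, since the description does not specify one.

```python
from typing import Optional

def coarse_direction(light_change_ms: Optional[int],
                     distance_change_ms: Optional[int],
                     same_moment_tol_ms: int = 50) -> Optional[str]:
    """Steps 401/402: vertical if both values change at the same moment,
    horizontal if they change at different moments."""
    if light_change_ms is None or distance_change_ms is None:
        return None  # one signal never changed: no gesture decision yet
    if abs(light_change_ms - distance_change_ms) <= same_moment_tol_ms:
        return "vertical"    # step 401
    return "horizontal"      # step 402
```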
Step 103, generating a gesture trigger instruction according to the motion direction, wherein the gesture trigger instruction is used for instructing the terminal to execute a corresponding function.
Specifically, as shown in fig. 5, in some embodiments of the present application, the generating the gesture trigger instruction according to the movement direction may include: step 501, step 502 and/or step 503.
In step 501, a first gesture trigger instruction is generated according to a gesture moving in the vertical direction.
For example, when the light intensity value and the distance value change at the same moment, it may be determined that the motion direction of the current gesture is the vertical direction, and the first gesture trigger instruction may be generated according to the gesture moving in the vertical direction.
Step 502, generating a second gesture triggering instruction according to the gesture moving in the horizontal direction.
For example, when the light intensity value and the distance value change at different moments, the motion direction of the current gesture may be determined to be the horizontal direction, and the second gesture trigger instruction may be generated according to the gesture moving in the horizontal direction.
Step 503, generating a third gesture trigger instruction according to the combination of a gesture moving in the vertical direction and a gesture moving in the horizontal direction.
For example, after a gesture moving in the vertical direction is detected and a gesture moving in the horizontal direction is then detected, the third gesture trigger instruction may be generated.
It should be noted that, in the embodiment of the present application, the first, second, and third gesture trigger instructions are only used to indicate that the terminal has detected a gesture in the corresponding motion direction. Which function the gesture instructs the terminal to execute may be determined according to the state of the terminal or the application running in the foreground of the terminal, which is not limited in the present application.
For example, when the application running in the foreground of the terminal is a preset application, if the terminal detects any one of the first, second, or third gesture trigger instructions, the preset application may automatically exit to run in the background, or be closed directly, so as to realize quick closing of the application.
For example, the preset application may be a photographing application, an instant messaging application, or a payment application, so as to protect the privacy of the user in time.
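As a purely illustrative sketch of the dispatch just described (the application names and the dispatch rule are invented for the example, since the description leaves the mapping to the terminal's state):

```python
PRESET_APPS = {"camera", "instant_chat", "payment"}  # invented example names

def handle_trigger(instruction: str, foreground_app: str) -> str:
    """If a privacy-sensitive preset application is in the foreground, any of
    the three gesture trigger instructions quickly hides or closes it."""
    if foreground_app in PRESET_APPS and instruction in {
        "first_gesture", "second_gesture", "third_gesture",
    }:
        return f"move {foreground_app} to background"
    return "no action"
```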
According to the embodiments of the application, the motion direction of the current gesture is determined according to the time sequence in which the light intensity value and the distance value change, and a gesture trigger instruction is generated according to the motion direction, so that the terminal is triggered to execute the corresponding function. This avoids the problem that the terminal cannot be triggered to realize the corresponding function when the user is wearing gloves or the user's hands are stained with dirt or water and the touch screen cannot be clicked or slid, meets the operation requirements of users in different scenarios, and improves the triggering efficiency of the terminal.
In order to determine the motion direction of the user's gesture more precisely and to generate a greater variety of gesture trigger instructions, in some embodiments of the present application, as shown in fig. 6, step 401 described above may include steps 601 to 603.
In step 601, if the light intensity value and the distance value change at the same moment, the change trend of the light intensity value and the distance value is obtained.
In the embodiment of the present application, the above change trend includes monotonically increasing and monotonically decreasing.
Monotonically increasing means that the light intensity value or the distance value changes from small to large, and monotonically decreasing means that it changes from large to small.
In step 602, if the trend of the light intensity value and the distance value is monotonically increasing, it is determined that the motion direction of the current gesture is a direction perpendicular to the terminal and away from the terminal.
For example, as shown in fig. 7, when the motion direction of the user's hand 71 is the direction 72 perpendicular to and away from the terminal, the angle of view of the light sensor increases, so the light intensity value increases; at this time, the distance value acquired by the distance sensor also gradually increases. Therefore, if the change trend of the light intensity value and the distance value is monotonically increasing, it can be determined that the motion direction of the current gesture is the direction perpendicular to and away from the terminal.
In step 603, if the trend of the light intensity value and the distance value is monotonically decreasing, the motion direction of the current gesture is determined to be the direction perpendicular to the terminal and approaching the terminal.
For example, as shown in fig. 7, when the motion direction of the user's hand 71 is the direction 73 perpendicular to and approaching the terminal, the angle of view of the light sensor decreases, so the light intensity value decreases; at this time, the distance value acquired by the distance sensor also gradually decreases. Therefore, if the change trend of the light intensity value and the distance value is monotonically decreasing, it can be determined that the motion direction of the current gesture is the direction perpendicular to and approaching the terminal.
Also, in some embodiments of the present application, as shown in fig. 7 and 8, the step 402 may further include steps 801 to 802.
In step 801, if the moment at which the light intensity value changes is earlier than the moment at which the distance value changes, it is determined that the motion direction of the current gesture is the horizontal direction 74 from the light sensor to the distance sensor.
In step 802, if the moment at which the distance value changes is earlier than the moment at which the light intensity value changes, it is determined that the motion direction of the current gesture is the horizontal direction 75 from the distance sensor to the light sensor.
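Putting the trend test of steps 601 to 603 together with the ordering test of steps 801 and 802, a hypothetical classifier for the four directions of fig. 7 might look as follows; the function names, label strings, and 50 ms tolerance are the example's own assumptions.

```python
from typing import List, Optional

def monotonic_increasing(values: List[float]) -> bool:
    return all(a < b for a, b in zip(values, values[1:]))

def monotonic_decreasing(values: List[float]) -> bool:
    return all(a > b for a, b in zip(values, values[1:]))

def fine_direction(light: List[float], distance: List[float],
                   light_t_ms: int, dist_t_ms: int,
                   tol_ms: int = 50) -> Optional[str]:
    """Map the two sensor traces to one of the four directions of fig. 7.
    `light_t_ms` / `dist_t_ms` are the moments each value first changed."""
    if abs(light_t_ms - dist_t_ms) <= tol_ms:            # vertical (steps 601-603)
        if monotonic_increasing(light) and monotonic_increasing(distance):
            return "vertical_away"      # direction 72: perpendicular, away
        if monotonic_decreasing(light) and monotonic_decreasing(distance):
            return "vertical_toward"    # direction 73: perpendicular, approaching
        return None                     # trends disagree: no decision
    if light_t_ms < dist_t_ms:                           # horizontal (steps 801-802)
        return "horizontal_light_to_distance"            # direction 74
    return "horizontal_distance_to_light"                # direction 75
```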
Since the terminal can thus detect the direction 72 perpendicular to and away from the terminal, the direction 73 perpendicular to and approaching the terminal, the horizontal direction 74 from the light sensor to the distance sensor, and the horizontal direction 75 from the distance sensor to the light sensor, in some embodiments of the present application the trigger instructions can be refined accordingly. In step 501, generating the first gesture trigger instruction according to the gesture moving in the vertical direction may further include: generating a first sub-gesture trigger instruction according to a gesture moving perpendicular to and away from the terminal; and generating a second sub-gesture trigger instruction according to a gesture moving perpendicular to and toward the terminal. In step 502, generating the second gesture trigger instruction according to the gesture moving in the horizontal direction may further include: generating a third sub-gesture trigger instruction according to a gesture moving in the horizontal direction from the light sensor to the distance sensor; and generating a fourth sub-gesture trigger instruction according to a gesture moving in the horizontal direction from the distance sensor to the light sensor. In step 503, generating the third gesture trigger instruction according to the combination of the gesture moving in the vertical direction and the gesture moving in the horizontal direction may further include: generating the third gesture trigger instruction according to a combination of one or more of the first, second, third, and fourth sub-gesture trigger instructions.
After a sub-gesture trigger instruction is generated, the terminal can execute the corresponding function according to the state of the terminal or the application running in the foreground of the terminal.
For example, when there is an incoming call on the terminal, the terminal may answer the call when the third sub-gesture trigger instruction generated by the user is detected, or reject the call when the fourth sub-gesture trigger instruction generated by the user is detected.
For another example, when the user uses the terminal to read an electronic book, the user can trigger the terminal to generate the third sub-gesture trigger instruction so that the terminal turns the page to the left; trigger the terminal to generate the fourth sub-gesture trigger instruction so that the terminal turns the page to the right; trigger the terminal to generate the first sub-gesture trigger instruction so that the terminal turns pages to the left quickly; or trigger the terminal to generate the second sub-gesture trigger instruction so that the terminal turns pages to the right quickly, thereby realizing various page turning functions of the terminal.
In the process of generating the third gesture trigger instruction according to a combination of one or more of the first, second, third, and fourth sub-gesture trigger instructions, after detecting any one of these sub-gesture trigger instructions, the terminal may detect whether the user triggers another one of them within a preset time period; if so, the detected sub-gesture trigger instructions are combined to generate the third gesture trigger instruction.
For example, when the motion trajectory of the user's hand forms a gesture such as a T shape, an inverted T shape, or a Z shape, the terminal may generate the third gesture trigger instruction obtained by the above combination and execute the function corresponding to the third gesture trigger instruction.
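A minimal sketch of this combination logic, assuming an 800 ms preset time period (the description fixes no value) and invented instruction labels:

```python
from typing import List, Tuple, Union

WINDOW_MS = 800  # assumed preset time period for combining sub-gestures

Event = Tuple[int, str]  # (timestamp in ms, sub-gesture trigger instruction)

def combine_sub_gestures(events: List[Event]) -> List[Union[Event, Tuple[str, List[str]]]]:
    """Merge consecutive sub-gesture trigger instructions that each arrive
    within WINDOW_MS of the previous one into a single third gesture trigger
    instruction (e.g. a T-shaped, inverted-T-shaped or Z-shaped trajectory)."""
    if not events:
        return []
    groups, current = [], [events[0]]
    for prev, cur in zip(events, events[1:]):
        if cur[0] - prev[0] <= WINDOW_MS:
            current.append(cur)      # still inside the preset time period
        else:
            groups.append(current)   # window expired: close the group
            current = [cur]
    groups.append(current)
    return [
        ("third_gesture", [e[1] for e in g]) if len(g) > 1 else g[0]
        for g in groups
    ]
```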
In order to let the user know that the terminal has been successfully triggered to execute the corresponding function, in some embodiments of the present application, after the motion direction of the current gesture is determined according to the time sequence in which the light intensity value and the distance value change, the method may further include: displaying the motion trail of the current gesture according to the motion direction of the current gesture.
For example, suppose the user is reading an electronic book and repeatedly triggers the terminal to generate the fourth sub-gesture trigger instruction so that the terminal turns the page to the right. If the electronic book has already been read to the last page, the terminal will not turn the page, and the user may mistakenly conclude that the gesture operation is unresponsive. If, after the motion direction of the current gesture is determined, the motion trail of the current gesture is displayed according to that motion direction, the user can easily see that the terminal was successfully triggered to execute the rightward page turning function and that the electronic book has simply reached the last page.
In addition, when the user's hand is too far away from the terminal, the terminal cannot detect the corresponding gesture and therefore does not display a motion trail. Whether or not a motion trail is displayed can thus also serve to remind the user whether the gesture trigger was successful.
In the embodiment of the application, in order to avoid misoperation of the terminal, the detection ranges of the light sensor and the distance sensor may be set to 1 cm to 15 cm from the terminal.
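The 1 cm to 15 cm band suggests a simple gate on the distance readings; a trivial sketch (the function name is an assumption, the bounds come from this paragraph):

```python
def in_detection_range(distance_cm: float, lo: float = 1.0, hi: float = 15.0) -> bool:
    # Drop readings outside the 1 cm to 15 cm band to avoid accidental triggers.
    return lo <= distance_cm <= hi
```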
It should be noted that, for simplicity of description, the foregoing method embodiments are described as a series of actions; however, those skilled in the art should understand that the present application is not limited by the described order of actions, since some steps may be performed in other orders according to the present application.
Fig. 9 shows a schematic structural diagram of a gesture detection apparatus 900 according to an embodiment of the present application. The gesture detection apparatus 900 is configured on a terminal provided with a light sensor and a distance sensor, and includes an acquisition unit 901, a determining unit 902, and a generating unit 903.
an acquisition unit 901, configured to acquire a light intensity value acquired by the light sensor and a distance value acquired by the distance sensor;
a determining unit 902, configured to determine the motion direction of the current gesture according to the time sequence in which the light intensity value and the distance value change; and
a generating unit 903, configured to generate a gesture trigger instruction according to the motion direction, where the gesture trigger instruction is used to instruct the terminal to execute a corresponding function.
In some embodiments of the present application, the determining unit 902 is further configured to determine that the motion direction of the current gesture is the vertical direction if the light intensity value and the distance value change at the same moment; and determine that the motion direction of the current gesture is the horizontal direction if the light intensity value and the distance value do not change at the same moment.
In some embodiments of the present application, the determining unit 902 is further configured to obtain the change trend of the light intensity value and the distance value if the two values change at the same moment; determine that the motion direction of the current gesture is perpendicular to and away from the terminal if the change trend of the light intensity value and the distance value is monotonically increasing; and determine that the motion direction of the current gesture is perpendicular to and approaching the terminal if the change trend of the light intensity value and the distance value is monotonically decreasing.
In some embodiments of the present application, the determining unit 902 is further configured to determine that the movement direction of the current gesture is a horizontal direction from the light sensor to the distance sensor if the time when the light intensity value changes is earlier than the time when the distance value changes; and if the time when the distance value changes is earlier than the time when the light intensity value changes, determining that the movement direction of the current gesture is the horizontal direction from the distance sensor to the light sensor.
In some embodiments of the present application, the generating unit 903 is further configured to generate a first gesture trigger instruction according to a gesture moving in the vertical direction; and/or generate a second gesture trigger instruction according to a gesture moving in the horizontal direction; and/or generate a third gesture trigger instruction according to a combination of the gesture moving in the vertical direction and the gesture moving in the horizontal direction.
In some embodiments of the present application, the generating unit 903 is further configured to generate a first sub-gesture trigger instruction according to a gesture moving perpendicular to and away from the terminal; generate a second sub-gesture trigger instruction according to a gesture moving perpendicular to and toward the terminal; generate a third sub-gesture trigger instruction according to a gesture moving in the horizontal direction from the light sensor to the distance sensor; generate a fourth sub-gesture trigger instruction according to a gesture moving in the horizontal direction from the distance sensor to the light sensor; and generate the third gesture trigger instruction according to a combination of one or more of the first, second, third, and fourth sub-gesture trigger instructions.
In some embodiments of the present application, the gesture detection apparatus further includes a display unit, configured to display a motion trajectory of the current gesture according to a motion direction of the current gesture after determining the motion direction of the current gesture according to the time sequence of the change in the light intensity value and the distance value.
It should be noted that, for convenience and brevity of description, the specific working process of the gesture detection apparatus 900 described above may refer to the corresponding process of the method described in fig. 1 to 8, and will not be described herein again.
As shown in fig. 10, the present application provides a terminal for implementing the gesture detection method, where the terminal may include: a processor 11, a light sensor 12, a distance sensor 13, a memory 14, one or more input devices 16 (only one shown in fig. 10) and one or more output devices 15 (only one shown in fig. 10). The processor 11, the memory 14, the input device 16 and the output device 15 are connected by a bus 17.
It should be appreciated that in embodiments of the present application, the processor 11 may be a central processing unit (Central Processing Unit, CPU), which may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSPs), application specific integrated circuits (Application Specific Integrated Circuit, ASICs), field programmable gate arrays (Field-Programmable Gate Array, FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The input device 16 may include a virtual keyboard, a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of a fingerprint), a microphone, etc., and the output device 15 may include a display, a speaker, etc.
Memory 14 may include read only memory and random access memory and provides instructions and data to processor 11. Some or all of memory 14 may also include non-volatile random access memory. For example, the memory 14 may also store information of the device type.
The memory 14 stores a computer program that is executable on the processor 11, for example, a program of a gesture detection method. The steps in the gesture detection method embodiment, for example, steps 101 to 103 shown in fig. 1, are implemented when the processor 11 executes the computer program. Alternatively, the processor 11 may implement the functions of the units in the above-described apparatus embodiments, such as the functions of the units 901 to 903 shown in fig. 9, when executing the above-described computer program.
The computer program may be divided into one or more modules/units, which are stored in the memory 14 and executed by the processor 11 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, where the instruction segments are used to describe the execution of the computer program in the terminal for gesture detection. For example, the computer program may be divided into an acquisition unit, a determining unit, and a generating unit, each unit having the following specific functions:
the acquisition unit is used for acquiring a light intensity value acquired by the light sensor and a distance value acquired by the distance sensor;
the determining unit is used for determining the motion direction of the current gesture according to the time sequence in which the light intensity value and the distance value change; and
the generating unit is used for generating a gesture trigger instruction according to the motion direction, wherein the gesture trigger instruction is used for instructing the terminal to execute a corresponding function.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional units and modules is illustrated. In practical applications, the above functions may be assigned to different functional units and modules as needed; that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit; the integrated units may be implemented in the form of hardware or in the form of software functional units. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other and are not used to limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
In the foregoing embodiments, each embodiment is described with its own emphasis. For parts that are not detailed or described in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal and method may be implemented in other manners. For example, the apparatus/terminal embodiments described above are merely illustrative, e.g., the division of the modules or units described above is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program, which may be stored in a computer readable storage medium; when the computer program is executed by a processor, the steps of each method embodiment may be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction; for example, in some jurisdictions, the computer readable medium does not include electrical carrier signals and telecommunication signals according to legislation and patent practice.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.
Claims (9)
1. A gesture detection method applied to a terminal, characterized in that the terminal is provided with a light sensor and a distance sensor; the gesture detection method comprises the following steps:
acquiring a light intensity value acquired by a light sensor and a distance value acquired by a distance sensor;
determining the motion direction of the current gesture according to the time sequence of the change of the light intensity value and the distance value;
generating a gesture triggering instruction according to the movement direction, wherein the gesture triggering instruction is used for indicating a terminal to execute a corresponding function;
the determining the motion direction of the current gesture according to the time sequence of the change of the light intensity value and the distance value comprises:
if the time of the change of the light intensity value and the distance value is the same time, determining that the motion direction of the current gesture is a vertical direction;
and if the time of the change of the light intensity value and the distance value is not the same time, determining that the motion direction of the current gesture is the horizontal direction.
2. The gesture detection method as set forth in claim 1, wherein determining that the motion direction of the current gesture is a vertical direction if the time at which the light intensity value and the distance value change is the same time comprises:
if the time of the change of the light intensity value and the distance value is the same time, acquiring the change trend of the light intensity value and the distance value;
if the change trend of the light intensity value and the distance value is monotonically increasing, determining that the motion direction of the current gesture is perpendicular to the terminal and far away from the terminal;
and if the change trend of the light intensity value and the change trend of the distance value are monotonically decreasing, determining that the movement direction of the current gesture is perpendicular to the terminal and is close to the terminal.
3. The gesture detection method of claim 1, wherein determining that the movement direction of the current gesture is a horizontal direction if the time at which the light intensity value and the distance value change is not the same time comprises:
if the time of the change of the light intensity value is earlier than the time of the change of the distance value, determining that the movement direction of the current gesture is the horizontal direction from the light sensor to the distance sensor;
and if the time when the distance value changes is earlier than the time when the light intensity value changes, determining that the movement direction of the current gesture is the horizontal direction from the distance sensor to the light sensor.
4. The gesture detection method of claim 1, wherein the generating a gesture trigger instruction according to the movement direction comprises:
generating a first gesture triggering instruction according to the gesture moving in the vertical direction; and/or,
generating a second gesture triggering instruction according to the gesture moving in the horizontal direction; and/or,
and generating a third gesture triggering instruction according to the combination of the gesture moving in the vertical direction and the gesture moving in the horizontal direction.
5. The gesture detection method of claim 4, wherein the generating a first gesture triggering instruction according to the gesture moving in the vertical direction comprises:
generating a first sub-gesture triggering instruction according to a gesture which is perpendicular to the terminal and moves in a direction away from the terminal;
generating a second sub-gesture triggering instruction according to a gesture which is perpendicular to the terminal and moves in a direction close to the terminal;
the generating a second gesture triggering instruction according to the gesture moving in the horizontal direction comprises the following steps:
generating a third sub-gesture triggering instruction according to the gesture moving in the horizontal direction from the light sensor to the distance sensor;
generating a fourth sub-gesture trigger instruction according to the gesture moving in the horizontal direction from the distance sensor to the light sensor;
the generating a third gesture triggering instruction according to the combination of the gesture moving in the vertical direction and the gesture moving in the horizontal direction comprises the following steps:
and generating the third gesture triggering instruction according to a combination of one or more of the first sub-gesture trigger instruction, the second sub-gesture trigger instruction, the third sub-gesture trigger instruction and the fourth sub-gesture trigger instruction.
6. The gesture detection method according to any one of claims 1 to 5, further comprising, after the determining of the motion direction of the current gesture according to the time sequence in which the light intensity value and the distance value change:
and displaying the motion trail of the current gesture according to the motion direction of the current gesture.
7. A gesture detection device configured at a terminal, wherein the terminal is provided with a light sensor and a distance sensor; the gesture detection device includes:
the acquisition unit is used for acquiring a light intensity value acquired by the light sensor and a distance value acquired by the distance sensor;
the determining unit is used for determining the motion direction of the current gesture according to the time sequence of the change of the light intensity value and the distance value;
the generating unit is used for generating a gesture triggering instruction according to the movement direction, wherein the gesture triggering instruction is used for indicating the terminal to execute a corresponding function;
the determining the motion direction of the current gesture according to the time sequence of the change of the light intensity value and the distance value comprises:
if the time of the change of the light intensity value and the distance value is the same time, determining that the motion direction of the current gesture is a vertical direction;
and if the time of the change of the light intensity value and the distance value is not the same time, determining that the motion direction of the current gesture is the horizontal direction.
8. A terminal comprising a light sensor, a distance sensor, a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 6 when the computer program is executed.
9. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910644106.8A | 2019-07-16 | 2019-07-16 | Gesture detection method, gesture detection device, terminal and computer readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910644106.8A | 2019-07-16 | 2019-07-16 | Gesture detection method, gesture detection device, terminal and computer readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110362206A | 2019-10-22 |
CN110362206B | 2023-09-01 |
Family
ID=68219979
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910644106.8A Active CN110362206B (en) | 2019-07-16 | 2019-07-16 | Gesture detection method, gesture detection device, terminal and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN | CN110362206B |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102014202490A1 (en) * | 2014-02-12 | 2015-08-13 | Volkswagen Aktiengesellschaft | Apparatus and method for signaling a successful gesture input |
CN105807907B (en) * | 2014-12-30 | 2018-09-25 | 富泰华工业(深圳)有限公司 | Body-sensing symphony performance system and method |
CN106325467B (en) * | 2015-06-15 | 2021-10-29 | 中兴通讯股份有限公司 | Method and device for controlling mobile terminal and mobile terminal |
CN107479714A (en) * | 2017-08-28 | 2017-12-15 | 歌尔科技有限公司 | Content switching display methods, device and robot |
- 2019-07-16: Application CN201910644106.8A filed in China; granted as CN110362206B (status: active)
Also Published As
Publication number | Publication date |
---|---|
CN110362206A | 2019-10-22 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||