WO2017124773A1 - Gesture recognition method and device - Google Patents
Gesture recognition method and device
- Publication number
- WO2017124773A1 (PCT/CN2016/100994)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- ambient light
- light sensor
- gesture
- module
- state
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0421—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04108—Touchless 2D- digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction
Definitions
- the present disclosure relates to the field of display technologies, and in particular, to a gesture recognition method and apparatus.
- touch screens support more and more functions, such as gesture recognition.
- the touch screen first determines a touch position at which the user's finger touches the touch screen, and then recognizes a gesture made by the user according to the touch position.
- the present disclosure provides a gesture recognition method and apparatus.
- a gesture recognition method is provided, applied to a terminal that includes a touch screen in which ambient light sensors are distributed, the method including:
- when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, detecting whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from the unoccluded state to the occluded state, and then from the occluded state to the unoccluded state;
- determining the location of the at least one ambient light sensor when the at least one ambient light sensor satisfies the preset change rule;
- identifying the user's operation gesture based on the location of the at least one ambient light sensor.
- detecting whether at least one ambient light sensor meets a preset change rule including:
- when the light intensity value first decreases and then increases, it is determined that the light incident on the ambient light sensor switched from the unoccluded state to the occluded state and then from the occluded state to the unoccluded state, and that the preset change rule is satisfied.
- the user's operation gesture is identified according to the location of the at least one ambient light sensor, including:
- the positions of the at least one ambient light sensor are identified, in sequence, as the operation gesture.
- the method further includes:
- the user's occlusion gesture is recognized when the respective minimum values remain unchanged for the same time period.
- a gesture recognition apparatus is provided, for use in a terminal including a touch screen in which ambient light sensors are distributed, the apparatus including:
- a first detecting module configured to detect whether the at least one ambient light sensor satisfies a preset change rule when light incident on the at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, the preset change rule being that the light incident on the ambient light sensor first switches from the unoccluded state to the occluded state and then from the occluded state to the unoccluded state;
- a first determining module configured to determine the location of the at least one ambient light sensor when the detection result of the first detecting module is that the at least one ambient light sensor meets the preset change rule;
- the first identification module is configured to identify an operation gesture of the user according to the location of the at least one ambient light sensor determined by the first determining module.
- the first detecting module includes:
- a first acquiring submodule configured to acquire, for each ambient light sensor, a light intensity value measured by the ambient light sensor
- the detecting submodule is configured to detect whether the light intensity value acquired by the first acquiring submodule is first reduced and then increased;
- a determining submodule configured to: when the detection result of the detecting submodule is that the light intensity value first decreases and then increases, determine that the light incident on the ambient light sensor first switched from the unoccluded state to the occluded state and then from the occluded state to the unoccluded state, so that the preset change rule is satisfied.
- the first identification module includes:
- a second acquiring submodule configured to acquire an order in which at least one ambient light sensor is sequentially occluded
- the identification sub-module is configured to identify the location of the at least one ambient light sensor as an operational gesture in an order acquired by the second acquisition sub-module.
- the device further includes:
- the first obtaining module is configured to acquire, from the at least one ambient light sensor, a change duration of the light intensity value of each ambient light sensor;
- a calculation module configured to calculate an average value of each change duration acquired by the first acquisition module
- a second determining module configured to determine an operating speed of the gesture according to an average value calculated by the computing module
- the third determining module is configured to determine a response manner to the gesture according to the operating speed determined by the second determining module.
- the device further includes:
- a second acquiring module configured to acquire, from the at least one ambient light sensor, a minimum value of a change in the light intensity value of each ambient light sensor
- a second detecting module configured to detect whether each minimum value acquired by the second acquiring module remains constant over the same time period;
- the second identification module is configured to recognize the occlusion gesture of the user when the result detected by the second detection module is that the respective minimum values remain unchanged for the same period of time.
- a gesture recognition apparatus is provided, for use in a terminal including a touch screen in which ambient light sensors are distributed, the apparatus including:
- a memory for storing processor-executable instructions;
- a processor configured to:
- when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, detect whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from the unoccluded state to the occluded state, and then from the occluded state to the unoccluded state;
- determine the location of the at least one ambient light sensor when the preset change rule is satisfied;
- identify the user's operation gesture based on the location of the at least one ambient light sensor.
- this solves the problem that the terminal cannot recognize an operation gesture made by the user when the user is inconvenienced from touching the touch screen; it adds a method of gesture recognition and improves the flexibility of gesture recognition.
- the set of recognizable operation gestures is extended, which solves the problem that the user has few available operation gestures when not touching the touch screen and that the touch screen has few ways of responding to operation gestures.
- FIG. 1 is a flowchart of a gesture recognition method according to an exemplary embodiment.
- FIG. 2A is a flowchart of a gesture recognition method according to another exemplary embodiment.
- FIG. 2B is a scene diagram of determining an operation position of an operation gesture, according to an exemplary embodiment.
- FIG. 2C is a scene diagram illustrating an operational position of determining an operation gesture, according to another exemplary embodiment.
- FIG. 2D is a scene diagram of a recognition operation gesture according to an exemplary embodiment.
- FIG. 2E is a scene diagram illustrating a recognition operation gesture, according to another exemplary embodiment.
- FIG. 2F is a scene diagram of a recognition operation gesture according to another exemplary embodiment.
- FIG. 2G is a flowchart illustrating a method of identifying a speed of an operation gesture, according to an exemplary embodiment.
- FIG. 2H is a flowchart of a method for identifying an occlusion gesture according to an exemplary embodiment.
- FIG. 3 is a block diagram of a gesture recognition apparatus, according to an exemplary embodiment.
- FIG. 4 is a block diagram of a gesture recognition apparatus, according to an exemplary embodiment.
- FIG. 5 is a block diagram of an apparatus for gesture recognition, according to an exemplary embodiment.
- FIG. 1 is a flowchart of a gesture recognition method according to an exemplary embodiment, applied to a terminal that includes a touch screen in which ambient light sensors are distributed, as shown in FIG. 1.
- the method includes the following steps.
- step 101: when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, detect whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from the unoccluded state to the occluded state, and then from the occluded state to the unoccluded state.
- step 102 the position of the at least one ambient light sensor is determined when the at least one ambient light sensor satisfies a preset change rule.
- step 103 the user's operational gesture is identified based on the location of the at least one ambient light sensor.
- in the gesture recognition method provided by this embodiment, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, the terminal detects whether the light incident on each ambient light sensor first switches from the unoccluded state to the occluded state and then from the occluded state to the unoccluded state, and recognizes the user's operation gesture according to the position of the at least one ambient light sensor. The user therefore does not need to touch the touch screen for the terminal to recognize the operation gesture, which solves the problem that the terminal cannot recognize the user's operation gesture when the user is inconvenienced from touching the touch screen, adds a method of gesture recognition, and improves the flexibility of gesture recognition.
- FIG. 2A is a flowchart of a gesture recognition method applied to a terminal including a touch screen in which an ambient light sensor is distributed, as shown in FIG. 2A, according to another exemplary embodiment.
- the identification method includes the following steps.
- step 201: when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, detect whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light entering the ambient light sensor first switches from the unoccluded state to the occluded state, and then from the occluded state to the unoccluded state.
- ordinarily, the user's finger needs to touch the touch screen before the terminal can recognize the user's operation gesture; that is, the operation gesture needs to act on the touch screen.
- when the user's finger is dirty and it is inconvenient to touch the touch screen, the user must first wipe the finger and then perform the touch operation, so the user cannot operate the terminal in time; alternatively, if the user performs the touch operation directly, the touch screen becomes contaminated.
- the embodiment provides a method for recognizing a user's operation gesture without directly touching the touch screen with a finger, and the specific method is as follows.
- the ambient light sensors distributed in the touch screen can measure the intensity of the light incident on them. When an object shades the touch screen, the light intensity value measured by the occluded ambient light sensor decreases. Therefore, an ambient light sensor can determine from its light intensity value whether it is occluded, and report an occlusion event to the terminal when it is.
- after receiving the occlusion event reported by the at least one ambient light sensor, the terminal detects whether there is a touch operation on the touch screen. If there is, the occlusion event corresponds to an operation gesture acting on the touch screen; if there is not, the occlusion event corresponds to an operation gesture that does not act on the touch screen.
- when the operation gesture is a sliding gesture, the light incident on the ambient light sensor is switched from the unoccluded state to the occluded state, and then from the occluded state to the unoccluded state. It is therefore possible to identify whether the operation gesture made by the user is a swipe gesture by detecting whether the at least one ambient light sensor satisfies the preset change rule.
- the method for detecting whether at least one ambient light sensor meets a preset change rule includes:
- for each ambient light sensor, obtain the light intensity value measured by that sensor; detect whether the light intensity value first decreases and then increases; when it does, determine that the light incident on the ambient light sensor first switched from the unoccluded state to the occluded state and then from the occluded state to the unoccluded state, so that the preset change rule is satisfied.
- when the light incident on an ambient light sensor is not blocked, the sensor measures a larger light intensity value; when the light is blocked, it measures a smaller one. The occlusion state of the ambient light sensor can therefore be determined from the change in the light intensity value: when the value changes from large to small, the light incident on the sensor is determined to have switched from the unoccluded state to the occluded state; when the value then changes from small to large, the light is determined to have switched from the occluded state back to the unoccluded state.
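The decrease-then-increase rule above can be sketched as a simple check on a per-sensor intensity trace. This is an illustrative sketch, not the patent's implementation: the sampling scheme and the `drop_ratio` threshold are assumptions.

```python
def satisfies_change_rule(intensities, drop_ratio=0.5):
    """Return True if the intensity trace first decreases (sensor occluded)
    and then increases back (unoccluded again), i.e. a V-shaped dip."""
    if len(intensities) < 3:
        return False
    baseline = intensities[0]
    lowest = min(intensities)
    i_low = intensities.index(lowest)
    # The dip must be significant (the sensor was actually shadowed) ...
    dipped = lowest < baseline * drop_ratio
    # ... and the trace must recover after the minimum.
    recovered = intensities[-1] > lowest and 0 < i_low < len(intensities) - 1
    return dipped and recovered

# A hand passing over the sensor: bright -> dark -> bright
print(satisfies_change_rule([800, 790, 120, 115, 780, 805]))  # True
# Ordinary ambient flicker: no significant dip
print(satisfies_change_rule([800, 795, 790, 800, 805, 800]))  # False
```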
- step 202 the location of the at least one ambient light sensor is determined when the at least one ambient light sensor satisfies the preset change rule.
- the position of the ambient light sensor at the center of each shaded portion may be taken as the position, at the current moment, of the ambient light sensors that satisfy the preset change rule, that is, as the operation position of the operation gesture at the current moment.
- the user makes an operation gesture 2, and a shadow portion 3 is formed on the terminal 1.
- the terminal 1 acquires the positions of the at least one ambient light sensor corresponding to the shaded portion 3, calculates the center point 4 of those positions, and uses the center point 4 as the operation position of the operation gesture at that moment.
- the user makes an operation gesture 5, forming a shaded portion 6 and a shaded portion 7 on the terminal 1. The terminal 1 acquires the positions of the at least one ambient light sensor corresponding to the shaded portion 6 and calculates their center point 8; it acquires the positions of the at least one ambient light sensor corresponding to the shaded portion 7 and calculates their center point 9. The center point 8 and the center point 9 serve as the operation positions of the operation gesture at that moment.
- a shaded portion may be determined as a region composed of at least one ambient light sensor whose light intensity values are equal at the same moment and whose positions are contiguous.
- the present embodiment does not limit the method for determining the shadow portion.
- the terminal determines an operation position of the operation gesture at each moment according to the position of the at least one ambient light sensor, and the operation gesture of the user can be recognized according to the operation position of each moment.
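The center-point calculation described above amounts to taking the centroid of the occluded sensors' coordinates. A minimal sketch, assuming each sensor's position is known as an (x, y) pair:

```python
def operation_position(sensor_positions):
    """Estimate the operation position at one instant as the centroid
    of the (x, y) positions of the occluded ambient light sensors."""
    xs = [x for x, _ in sensor_positions]
    ys = [y for _, y in sensor_positions]
    n = len(sensor_positions)
    return (sum(xs) / n, sum(ys) / n)

# Three occluded sensors forming one small shaded patch
print(operation_position([(10, 20), (14, 20), (12, 26)]))  # (12.0, 22.0)
```

For a gesture that casts two shaded portions (as in the two-shadow example above), the same function would be applied once per shaded region.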
- step 203 an order in which at least one ambient light sensor is sequentially occluded is acquired.
- the terminal may record, upon receiving an occlusion event, the time at which the event was received. Therefore, when the user's operation gesture is produced by a continuous action, such as a sliding operation, the terminal can acquire the times at which the at least one ambient light sensor was sequentially occluded, and determine from those times the order in which the at least one ambient light sensor was sequentially occluded.
- step 204 the position of the at least one ambient light sensor is identified as an operational gesture in sequence.
- from the operation positions determined at the respective moments, the operation trajectory of an operation gesture made without touching the touch screen can be obtained; from the order, determined in step 203, in which the at least one ambient light sensor was sequentially occluded, the operation direction of the gesture can be obtained. According to the operation trajectory and the operation direction, the terminal can recognize the operation gesture made by the user.
- for example, the operation position 11 is determined at the first moment, the operation position 12 at the second moment, and the operation position 13 at the third moment; according to the determined positions and the order in which they were determined, the user's operation gesture is recognized as a rightward swipe gesture.
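The ordering step can be sketched by sorting occlusion events by their recorded timestamps and reading the direction off the first and last operation positions. The event format and the dominant-axis heuristic are assumptions for illustration:

```python
def swipe_direction(events):
    """Given (timestamp, (x, y)) occlusion events, sort them by time and
    classify the swipe from the dominant displacement axis."""
    ordered = sorted(events)  # order in which sensors were occluded
    (_, (x0, y0)), (_, (x1, y1)) = ordered[0], ordered[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# Sensors occluded left-to-right over time -> rightward swipe
print(swipe_direction([(0.00, (10, 50)), (0.05, (40, 52)), (0.10, (75, 51))]))  # right
```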
- the terminal may further set a first angle threshold and a second angle threshold to determine an operation direction of the operation gesture, where the second angle threshold is greater than the first angle threshold.
- the terminal selects the operation positions determined at any two moments. If the angle between the line connecting the two operation positions and the horizontal direction is smaller than the first angle threshold, the operation gesture is recognized as a leftward or rightward sliding operation; if the angle is greater than the first angle threshold but smaller than the second angle threshold, the operation gesture is recognized as an oblique sliding operation; if the angle is greater than the second angle threshold, the operation gesture is recognized as an upward or downward sliding operation.
- for example, a first angle threshold of 30 degrees and a second angle threshold of 60 degrees are set in the terminal 1. The operation position 12 is determined at an arbitrarily selected first moment and the operation position 14 at an arbitrarily selected second moment; the angle between the line connecting the two operation positions and the horizontal direction is 45 degrees, which is greater than the first angle threshold and smaller than the second. Therefore, from the determined positions, the order in which they were determined, and the angle between the connecting line and the horizontal direction, the terminal recognizes the user's operation gesture as a swipe toward the upper right.
- alternatively, the terminal can calculate, for each pair of operation positions at adjacent moments, the angle between their connecting line and the horizontal direction, and compare the average of these angles with the first and second angle thresholds: if the average is smaller than the first angle threshold, the gesture is recognized as a leftward or rightward sliding operation; if the average is greater than the first angle threshold but smaller than the second angle threshold, it is recognized as an oblique sliding operation; if the average is greater than the second angle threshold, it is recognized as an upward or downward sliding operation.
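The two-threshold angle classification can be sketched as follows. The 30/60-degree defaults mirror the example above; the coordinate convention (whether y grows upward or downward) is an assumption:

```python
import math

def classify_by_angle(p0, p1, first_deg=30.0, second_deg=60.0):
    """Classify a swipe from the angle between segment p0->p1 and the
    horizontal axis: below the first threshold -> horizontal swipe,
    between the thresholds -> oblique, above the second -> vertical."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    angle = math.degrees(math.atan2(abs(dy), abs(dx)))  # 0..90 degrees
    if angle < first_deg:
        return "horizontal"
    if angle < second_deg:
        return "oblique"
    return "vertical"

print(classify_by_angle((0, 0), (10, 1)))   # horizontal (~5.7 degrees)
print(classify_by_angle((0, 0), (10, 10)))  # oblique (45 degrees)
print(classify_by_angle((0, 0), (1, 10)))   # vertical (~84 degrees)
```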
- if the terminal determines at least two operation positions at the same moment, then for each operation position, that position and the operation position closest to it at the adjacent moment are grouped into an operation combination, and the user's operation gesture is identified according to the determined operation combinations.
- for example, the operation positions 11 and 15 are determined at the first moment, the operation positions 12 and 16 at the second moment, and the operation positions 13 and 17 at the third moment. The terminal determines the operation positions 11, 12, and 13 as one operation combination and the operation positions 15, 16, and 17 as another operation combination, and identifies the user's operation gesture according to the two determined operation combinations.
- FIG. 2G is a flowchart illustrating a method for recognizing the speed of the operation gesture according to an exemplary embodiment.
- the method for recognizing the gesture speed is applied to a terminal including a touch screen.
- the ambient light sensor is distributed in the touch screen.
- the method for identifying the gesture speed includes the following steps.
- step 205 a change duration of the light intensity values of the respective ambient light sensors is acquired from the at least one ambient light sensor.
- in one possible implementation, for each ambient light sensor, the terminal records the times at which the light intensity value of the sensor starts to change and stops changing; when the ambient light sensor satisfies the preset change rule, the terminal calculates the change duration of the sensor's light intensity value from the recorded times. In another possible implementation, the terminal acquires the change duration of the sensor's light intensity value before detecting that the sensor meets the preset change rule, and when the rule is detected, directly reads the previously obtained change duration. This embodiment does not limit the timing at which the terminal obtains the change duration of the light intensity value.
- step 206 an average of the respective varying durations is calculated.
- the terminal calculates an average value of the obtained change durations, or the terminal selects at least one change duration from the acquired change durations, and calculates an average value of the at least one change duration.
- step 207 the operating speed of the gesture is determined based on the average.
- a time threshold is preset in the terminal, and the operation speed of the gesture can be determined by comparing the average value with the time threshold.
- the time threshold may be one or more.
- for example, two time thresholds are set in the terminal: a first time threshold and a second time threshold, with the second time threshold less than the first time threshold. If the average value is greater than the first time threshold, the terminal determines that the gesture is a slow gesture; if the average value is less than the second time threshold, the terminal determines that the gesture is a fast gesture; if the average value is greater than the second time threshold and less than the first time threshold, the terminal determines that the gesture is a medium-speed gesture.
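Steps 205 through 207 can be sketched as averaging the per-sensor change durations and comparing the average against the two time thresholds. The threshold values (in seconds) are illustrative, not taken from the patent:

```python
def gesture_speed(change_durations, first_s=0.6, second_s=0.2):
    """Average the per-sensor occlusion change durations and classify
    the gesture speed with two thresholds (second_s < first_s)."""
    avg = sum(change_durations) / len(change_durations)
    if avg > first_s:
        return "slow"
    if avg < second_s:
        return "fast"
    return "medium"

print(gesture_speed([0.10, 0.12, 0.08]))  # fast
print(gesture_speed([0.30, 0.40, 0.35]))  # medium
print(gesture_speed([0.90, 1.10, 0.80]))  # slow
```

The terminal could then pick a response per speed class, e.g. video fast-forward for a fast rightward swipe versus jumping to the next video for a slow one, as the example below describes.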
- step 208: a response manner to the gesture is determined based on the operating speed.
- this increases the number of ways in which the terminal can respond to an operation gesture.
- the response mode corresponding to the fast right-slip gesture is video fast forward; the response mode corresponding to the slow right-slip gesture is to jump to the next video.
- this embodiment further provides a method for identifying an occlusion gesture made by the user; as shown in the flowchart of FIG. 2H, the method includes the following steps.
- step 209 a minimum value of the change in the light intensity value of each of the ambient light sensors is acquired from the at least one ambient light sensor.
- when the terminal determines that the at least one ambient light sensor meets the preset change rule, it acquires the minimum value reached by each ambient light sensor's light intensity value during the change, the minimum value being the light intensity value measured by the ambient light sensor while the incident light is blocked.
- the minimum values of the light intensity values measured by the respective ambient light sensors are the same.
- step 210 it is detected whether the respective minimum values remain unchanged for the same period of time.
- when the terminal detects that at least one light intensity value is at its minimum at the same time, it continues to detect whether the at least one minimum value remains unchanged for the same time period.
- step 211 the user's occlusion gesture is recognized when the respective minimum values remain unchanged for the same time period.
- the terminal may recognize that the user's operation gesture is an occlusion gesture when each minimum value remains unchanged for the same period of time.
- by recognizing the user's occlusion gesture, the terminal expands the ways in which it can respond to operation gestures. For example, when the terminal recognizes that the user has made an occlusion gesture, the corresponding response may be to pause the video or music being played, or to click an application.
- a first predetermined time may be set in the terminal, and when the duration of each of the minimum values is greater than or equal to the first predetermined time, the operation gesture of the user is identified as the first type of occlusion gesture, or A second predetermined time is set in the terminal, and when the duration of each of the minimum values is less than or equal to the second predetermined time, the user's operation gesture is recognized as the second type of occlusion gesture.
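The occlusion-gesture check of steps 209 through 211 can be sketched by verifying that every sensor's minimum stays constant across a sampled window, with the hold duration distinguishing the two gesture types described above. The sampling interval and duration thresholds are assumptions:

```python
def occlusion_gesture(minima_over_time, first_hold_s=1.0, sample_dt=0.1):
    """Detect a cover (occlusion) gesture: every occluded sensor's minimum
    intensity stays constant over the same sampled window. The hold time
    distinguishes a long cover (type 1) from a short cover (type 2)."""
    # minima_over_time: per-instant tuples of each sensor's minimum value
    if not minima_over_time or any(t != minima_over_time[0] for t in minima_over_time):
        return None  # minima changed, so this is not an occlusion gesture
    hold = len(minima_over_time) * sample_dt
    return "long-cover" if hold >= first_hold_s else "short-cover"

# All sensors pinned at the same minima for 1.2 s -> long cover
print(occlusion_gesture([(12, 11, 13)] * 12))  # long-cover
print(occlusion_gesture([(12, 11, 13)] * 4))   # short-cover
```

Each gesture type could then map to a different response, matching the terminal's per-type response modes described above.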
- the terminal sets different response modes for different types of occlusion gestures.
- in the gesture recognition method provided by this embodiment, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, the terminal detects whether the light incident on each ambient light sensor first switches from the unoccluded state to the occluded state and then from the occluded state to the unoccluded state, and recognizes the user's operation gesture according to the position of the at least one ambient light sensor. The user therefore does not need to touch the touch screen for the terminal to recognize the operation gesture, which solves the problem that the terminal cannot recognize the user's operation gesture when the user is inconvenienced from touching the touch screen, adds a method of gesture recognition, and improves the flexibility of gesture recognition.
- the operation gesture can be extended, which solves the problem that the user does not have a small operation gesture when the touch screen is not touched, and the touch screen responds less to the operation gesture.
- FIG. 3 is a block diagram of a gesture recognition apparatus, according to an exemplary embodiment. The apparatus is applied to a terminal including a touch screen in which ambient light sensors are distributed. As shown in FIG. 3, the gesture recognition apparatus includes a first detecting module 310, a first determining module 320, and a first identifying module 330.
- The first detecting module 310 is configured to detect, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from the unoccluded state to the occluded state and then from the occluded state to the unoccluded state;
- The first determining module 320 is configured to determine the position of the at least one ambient light sensor when the first detecting module 310 detects that the at least one ambient light sensor satisfies the preset change rule;
- The first identifying module 330 is configured to recognize the user's operation gesture according to the position of the at least one ambient light sensor determined by the first determining module 320.
- In summary, the gesture recognition apparatus provided by this embodiment detects, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, whether the light incident on each ambient light sensor first switches from the unoccluded state to the occluded state and then from the occluded state to the unoccluded state, and recognizes the user's operation gesture according to the position of the at least one ambient light sensor.
- The terminal can therefore recognize an operation gesture without the user touching the screen, which solves the problem that the terminal cannot recognize the user's operation gesture when a touch operation is inconvenient, and achieves the effect of improving the flexibility of gesture recognition.
- FIG. 4 is a block diagram of a gesture recognition apparatus, according to an exemplary embodiment. The apparatus is applied to a terminal including a touch screen in which ambient light sensors are distributed. As shown in FIG. 4, the gesture recognition apparatus includes a first detecting module 410, a first determining module 420, and a first identifying module 430.
- The first detecting module 410 is configured to detect, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, whether the at least one ambient light sensor satisfies a preset change rule;
- The preset change rule is that the light incident on the ambient light sensor first switches from the unoccluded state to the occluded state and then from the occluded state to the unoccluded state;
- The first determining module 420 is configured to determine the position of the at least one ambient light sensor when the first detecting module 410 detects that the at least one ambient light sensor satisfies the preset change rule;
- The first identifying module 430 is configured to recognize the user's operation gesture according to the position of the at least one ambient light sensor determined by the first determining module 420.
- the first detecting module 410 includes: a first obtaining submodule 411, a detecting submodule 412, and a determining submodule 413.
- the first obtaining submodule 411 is configured to acquire, for each ambient light sensor, a light intensity value measured by the ambient light sensor;
- the detecting sub-module 412 is configured to detect whether the light intensity value acquired by the first acquiring sub-module 411 first becomes smaller and then becomes larger;
- The determining sub-module 413 is configured to determine, when the detecting sub-module 412 detects that the light intensity value first becomes smaller and then becomes larger, that the light incident on the ambient light sensor first switched from the unoccluded state to the occluded state and then from the occluded state to the unoccluded state, satisfying the preset change rule.
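The check performed by the detecting and determining sub-modules, whether a sensor's light intensity first becomes smaller and then becomes larger, could be sketched as follows (the `drop_ratio` noise threshold and sampling model are assumptions):

```python
def satisfies_preset_change_rule(intensities, drop_ratio=0.5):
    """Return True if the intensity series first falls (sensor becomes
    occluded) and then rises again (sensor becomes unoccluded).
    drop_ratio is an assumed noise threshold relative to the baseline."""
    if len(intensities) < 3:
        return False
    baseline = intensities[0]
    low = min(intensities)
    if low > baseline * drop_ratio:          # never meaningfully occluded
        return False
    low_index = intensities.index(low)
    # occluded somewhere in the middle, then recovered near the baseline
    return (0 < low_index < len(intensities) - 1
            and intensities[-1] > baseline * drop_ratio)

print(satisfies_preset_change_rule([300, 280, 40, 35, 260, 310]))  # True
print(satisfies_preset_change_rule([300, 290, 285, 280, 270]))     # False
```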
- the first identification module 430 includes: a second obtaining submodule 431 and an identifying submodule 432.
- the second obtaining submodule 431 is configured to acquire an order in which at least one ambient light sensor is sequentially occluded;
- The identification sub-module 432 is configured to recognize the positions of the at least one ambient light sensor, in the order acquired by the second acquisition sub-module 431, as the operation gesture.
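A minimal sketch of turning the order in which sensors at known positions are occluded into an operation gesture (the grid coordinates and gesture names are illustrative assumptions, not from the patent):

```python
# Sensor positions are hypothetical (x, y) grid coordinates on the touch
# screen; the first and last occluded sensors give a swipe direction.

def recognize_gesture(occlusion_order, sensor_positions):
    """occlusion_order: sensor ids in the order they were occluded.
    sensor_positions: id -> (x, y) position of each ambient light sensor."""
    if len(occlusion_order) < 2:
        return "tap_or_cover"
    x0, y0 = sensor_positions[occlusion_order[0]]
    x1, y1 = sensor_positions[occlusion_order[-1]]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

positions = {0: (0, 0), 1: (1, 0), 2: (2, 0)}
print(recognize_gesture([0, 1, 2], positions))  # swipe_right
print(recognize_gesture([2, 1, 0], positions))  # swipe_left
```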
- the device further includes: a first obtaining module 440, a calculating module 450, a second determining module 460, and a third determining module 470.
- the first obtaining module 440 is configured to acquire, from the at least one ambient light sensor, a change duration of the light intensity value of each ambient light sensor;
- the calculation module 450 is configured to calculate an average value of each change duration acquired by the first acquisition module 440;
- the second determining module 460 is configured to determine an operating speed of the gesture according to the average value calculated by the calculating module 450;
- the third determining module 470 is configured to determine a response manner to the gesture according to the operating speed determined by the second determining module 460.
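The speed-based response selection performed by modules 440 through 470 might look like the following sketch (the threshold and response names are assumptions for illustration):

```python
def response_mode(change_durations, fast_threshold=0.2):
    """Map the average per-sensor change duration (seconds) to an
    operation speed, then to a response mode (thresholds are assumed)."""
    if not change_durations:
        return None
    average = sum(change_durations) / len(change_durations)
    speed = "fast" if average < fast_threshold else "slow"
    # e.g. a fast sweep skips a track, a slow sweep scrubs within it
    return {"fast": "next_track", "slow": "seek_within_track"}[speed]

print(response_mode([0.10, 0.12, 0.09]))  # next_track
print(response_mode([0.40, 0.50, 0.45]))  # seek_within_track
```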
- the device further includes: a second obtaining module 480, a second detecting module 490, and a second identifying module 491.
- the second obtaining module 480 is configured to acquire, from the at least one ambient light sensor, a minimum value of a change in the light intensity value of each ambient light sensor;
- the second detecting module 490 is configured to detect whether each of the minimum values acquired by the second obtaining module 480 remains unchanged during the same time period;
- the second identification module 491 is configured to recognize the occlusion gesture of the user when the result detected by the second detection module 490 is that the respective minimum values remain unchanged for the same period of time.
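The stable-minimum check performed by modules 480 through 491 could be sketched as follows (the noise `tolerance` and the per-sensor sampling model are assumptions):

```python
def is_occlusion_gesture(per_sensor_samples, tolerance=5):
    """per_sensor_samples: for each sensor, the intensity samples taken
    during the same time window after every sensor reached its minimum.
    The gesture counts as an occlusion when each sensor's samples stay
    at its minimum value, within an assumed noise tolerance."""
    for samples in per_sensor_samples:
        if not samples:
            return False
        low = min(samples)
        if any(abs(s - low) > tolerance for s in samples):
            return False  # this sensor's minimum did not hold steady
    return True

print(is_occlusion_gesture([[30, 31, 29], [22, 22, 24]]))   # True
print(is_occlusion_gesture([[30, 31, 120], [22, 22, 24]]))  # False
```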
- In summary, the gesture recognition apparatus provided by this embodiment detects, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, whether the light incident on each ambient light sensor first switches from the unoccluded state to the occluded state and then from the occluded state to the unoccluded state, and recognizes the user's operation gesture according to the position of the at least one ambient light sensor.
- The terminal can therefore recognize an operation gesture without the user touching the screen, which solves the problem that the terminal cannot recognize the user's operation gesture when a touch operation is inconvenient, and achieves the effect of improving the flexibility of gesture recognition.
- The set of available operation gestures is also extended, which addresses the problem that, without touching the screen, the user has few operation gestures and the terminal has few ways to respond to them.
- An exemplary embodiment of the present disclosure provides a gesture recognition apparatus capable of implementing the gesture recognition method provided by the present disclosure. The apparatus is applied to a terminal including a touch screen in which ambient light sensors are distributed, and includes: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to: when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, detect whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from the unoccluded state to the occluded state and then from the occluded state to the unoccluded state; determine the position of the at least one ambient light sensor when the at least one ambient light sensor satisfies the preset change rule; and recognize the user's operation gesture based on the position of the at least one ambient light sensor.
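Putting the processor's configured steps together, an end-to-end sketch might look like this (all helper names, thresholds, and the frame-based sampling model are illustrative assumptions, not the patent's implementation):

```python
# Assumes each sensor reports a short time series of intensity samples.

def handle_frame(sensor_series, sensor_positions, touch_active, drop_ratio=0.5):
    """sensor_series: id -> list of intensity samples.
    Returns the recognized gesture, or None."""
    if touch_active:                      # touch input takes precedence
        return None
    occluded = []
    for sid, series in sensor_series.items():
        baseline, low = series[0], min(series)
        # preset change rule: intensity falls, then recovers
        if low <= baseline * drop_ratio and series[-1] > baseline * drop_ratio:
            occluded.append((series.index(low), sid))
    if not occluded:
        return None
    order = [sid for _, sid in sorted(occluded)]  # order of occlusion
    first, last = sensor_positions[order[0]], sensor_positions[order[-1]]
    return "swipe" if first != last else "cover"

series = {0: [300, 40, 310], 1: [300, 300, 45, 300]}
positions = {0: (0, 0), 1: (1, 0)}
print(handle_frame(series, positions, touch_active=False))  # swipe
```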
- FIG. 5 is a block diagram of an apparatus 500 for gesture recognition, according to an exemplary embodiment.
- device 500 can be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a gaming console, a tablet device, a medical device, a fitness device, a personal digital assistant, and the like.
- apparatus 500 can include one or more of the following components: processing component 502, memory 504, power component 506, multimedia component 508, audio component 510, input/output (I/O) interface 512, sensor component 514, and communication component 516.
- Processing component 502 typically controls the overall operation of device 500, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
- Processing component 502 can include one or more processors 518 to execute instructions to perform all or part of the steps described above.
- Further, processing component 502 can include one or more modules to facilitate interaction between processing component 502 and other components.
- For example, processing component 502 can include a multimedia module to facilitate interaction between multimedia component 508 and processing component 502.
- Memory 504 is configured to store various types of data to support operation at device 500. Examples of such data include instructions for any application or method operating on device 500, contact data, phone book data, messages, pictures, videos, and the like.
- The memory 504 can be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
- Power component 506 provides power to various components of device 500.
- Power component 506 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 500.
- the multimedia component 508 includes a screen that provides an output interface between the device 500 and the user.
- the screen can include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen can be implemented as a touch screen to receive input signals from the user.
- the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may sense not only the boundary of the touch or sliding action, but also the duration and pressure associated with the touch or slide operation.
- the multimedia component 508 includes a front camera and/or a rear camera. When the device 500 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each of the front camera and the rear camera can be a fixed optical lens system or have focus and optical zoom capability.
- the audio component 510 is configured to output and/or input an audio signal.
- the audio component 510 includes a microphone (MIC) configured to receive an external audio signal when the device 500 is in an operation mode, such as a call mode, a recording mode, or a voice recognition mode.
- the received audio signal may be further stored in memory 504 or transmitted via communication component 516.
- audio component 510 also includes a speaker for outputting an audio signal.
- the I/O interface 512 provides an interface between the processing component 502 and the peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
- Sensor assembly 514 includes one or more sensors for providing status assessments of various aspects of device 500.
- For example, sensor assembly 514 can detect the open/closed state of device 500 and the relative positioning of components, such as the display and keypad of device 500; sensor assembly 514 can also detect a change in the position of device 500 or of one of its components, the presence or absence of user contact with device 500, the orientation or acceleration/deceleration of device 500, and changes in the temperature of device 500.
- Sensor assembly 514 can include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
- Sensor assembly 514 may also include an ambient light sensor for detecting the intensity of ambient light of device 500.
- the sensor component 514 can also include an acceleration sensor, a gyro sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
- Communication component 516 is configured to facilitate wired or wireless communication between device 500 and other devices.
- the device 500 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof.
- In an exemplary embodiment, communication component 516 receives broadcast signals or broadcast-associated information from an external broadcast management system via a broadcast channel.
- the communication component 516 also includes a near field communication (NFC) module to facilitate short range communication.
- the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
- In an exemplary embodiment, apparatus 500 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above methods.
- In an exemplary embodiment, there is also provided a non-transitory computer readable storage medium comprising instructions, such as the memory 504 comprising instructions executable by processor 518 of apparatus 500 to perform the above method.
- For example, the non-transitory computer readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Abstract
Description
Claims (11)
- A gesture recognition method, characterized in that it is applied to a terminal including a touch screen in which ambient light sensors are distributed, the method comprising: when light incident on at least one ambient light sensor is blocked and no touch operation acting on the touch screen is detected, detecting whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unoccluded state to an occluded state and then from the occluded state to the unoccluded state; when the at least one ambient light sensor satisfies the preset change rule, determining the position of the at least one ambient light sensor; and recognizing a user's operation gesture according to the position of the at least one ambient light sensor.
- The method according to claim 1, wherein detecting whether the at least one ambient light sensor satisfies the preset change rule comprises: for each ambient light sensor, acquiring a light intensity value measured by the ambient light sensor; detecting whether the light intensity value first becomes smaller and then becomes larger; and when the light intensity value first becomes smaller and then becomes larger, determining that the light incident on the ambient light sensor first switched from the unoccluded state to the occluded state and then from the occluded state to the unoccluded state, satisfying the preset change rule.
- The method according to claim 1, wherein recognizing the user's operation gesture according to the position of the at least one ambient light sensor comprises: acquiring the order in which the at least one ambient light sensor is successively occluded; and recognizing the positions of the at least one ambient light sensor, in that order, as the operation gesture.
- The method according to any one of claims 1 to 3, further comprising: acquiring, from the at least one ambient light sensor, the change duration of the light intensity value of each ambient light sensor; calculating the average of the change durations; determining the operation speed of the gesture according to the average; and determining a response mode to the gesture according to the operation speed.
- The method according to claim 1, further comprising: acquiring, from the at least one ambient light sensor, the minimum value of the light intensity change of each ambient light sensor; detecting whether the respective minimum values remain unchanged during the same time period; and recognizing the user's occlusion gesture when the respective minimum values remain unchanged during the same time period.
- A gesture recognition apparatus, characterized in that it is applied to a terminal including a touch screen in which ambient light sensors are distributed, the apparatus comprising: a first detecting module configured to detect, when light incident on at least one ambient light sensor is blocked and no touch operation acting on the touch screen is detected, whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unoccluded state to an occluded state and then from the occluded state to the unoccluded state; a first determining module configured to determine the position of the at least one ambient light sensor when the first detecting module detects that the at least one ambient light sensor satisfies the preset change rule; and a first identifying module configured to recognize a user's operation gesture according to the position of the at least one ambient light sensor determined by the first determining module.
- The apparatus according to claim 6, wherein the first detecting module comprises: a first acquiring sub-module configured to acquire, for each ambient light sensor, a light intensity value measured by the ambient light sensor; a detecting sub-module configured to detect whether the light intensity value acquired by the first acquiring sub-module first becomes smaller and then becomes larger; and a determining sub-module configured to determine, when the detecting sub-module detects that the light intensity value first becomes smaller and then becomes larger, that the light incident on the ambient light sensor first switched from the unoccluded state to the occluded state and then from the occluded state to the unoccluded state, satisfying the preset change rule.
- The apparatus according to claim 6, wherein the first identifying module comprises: a second acquiring sub-module configured to acquire the order in which the at least one ambient light sensor is successively occluded; and an identification sub-module configured to recognize the positions of the at least one ambient light sensor, in the order acquired by the second acquiring sub-module, as the operation gesture.
- The apparatus according to any one of claims 6 to 8, further comprising: a first acquiring module configured to acquire, from the at least one ambient light sensor, the change duration of the light intensity value of each ambient light sensor; a calculating module configured to calculate the average of the change durations acquired by the first acquiring module; a second determining module configured to determine the operation speed of the gesture according to the average calculated by the calculating module; and a third determining module configured to determine a response mode to the gesture according to the operation speed determined by the second determining module.
- The apparatus according to claim 6, further comprising: a second acquiring module configured to acquire, from the at least one ambient light sensor, the minimum value of the light intensity change of each ambient light sensor; a second detecting module configured to detect whether the respective minimum values acquired by the second acquiring module remain unchanged during the same time period; and a second identifying module configured to recognize the user's occlusion gesture when the second detecting module detects that the respective minimum values remain unchanged during the same time period.
- A gesture recognition apparatus, characterized in that it is applied to a terminal including a touch screen in which ambient light sensors are distributed, the apparatus comprising: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to: when light incident on at least one ambient light sensor is blocked and no touch operation acting on the touch screen is detected, detect whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unoccluded state to an occluded state and then from the occluded state to the unoccluded state; when the at least one ambient light sensor satisfies the preset change rule, determine the position of the at least one ambient light sensor; and recognize a user's operation gesture according to the position of the at least one ambient light sensor.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
RU2017140024A RU2690202C2 (ru) | 2016-01-19 | 2016-09-30 | Способ и устройство для распознавания жеста |
JP2016569698A JP6533535B2 (ja) | 2016-01-19 | 2016-09-30 | ジェスチャー識別方法、装置、プログラム及び記録媒体 |
KR1020177031107A KR102045232B1 (ko) | 2016-01-19 | 2016-09-30 | 제스처 식별 방법, 장치, 프로그램 및 기록매체 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610035203.3A CN105511631B (zh) | 2016-01-19 | 2016-01-19 | 手势识别方法及装置 |
CN201610035203.3 | 2016-01-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017124773A1 true WO2017124773A1 (zh) | 2017-07-27 |
Family
ID=55719681
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/100994 WO2017124773A1 (zh) | 2016-01-19 | 2016-09-30 | 手势识别方法及装置 |
Country Status (7)
Country | Link |
---|---|
US (1) | US20170205962A1 (zh) |
EP (1) | EP3196736A1 (zh) |
JP (1) | JP6533535B2 (zh) |
KR (1) | KR102045232B1 (zh) |
CN (1) | CN105511631B (zh) |
RU (1) | RU2690202C2 (zh) |
WO (1) | WO2017124773A1 (zh) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11045736B2 (en) | 2018-05-02 | 2021-06-29 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing apparatus, information processing system, and information processing method |
US11103784B2 (en) | 2018-05-02 | 2021-08-31 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing apparatus, information processing system, and information processing method |
US11484783B2 (en) | 2018-05-02 | 2022-11-01 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing apparatus, information processing system, and information processing method |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105511631B (zh) * | 2016-01-19 | 2018-08-07 | 北京小米移动软件有限公司 | 手势识别方法及装置 |
WO2017147869A1 (zh) * | 2016-03-03 | 2017-09-08 | 邱琦 | 光感式手势识别方法 |
CN106095292A (zh) * | 2016-06-03 | 2016-11-09 | 上海与德通讯技术有限公司 | 终端设备及其操作方法 |
JP6838434B2 (ja) | 2017-03-13 | 2021-03-03 | オムロン株式会社 | 環境センサ |
CN109597405A (zh) * | 2017-09-30 | 2019-04-09 | 阿里巴巴集团控股有限公司 | 控制机器人移动的方法及机器人 |
GB2572978B (en) * | 2018-04-18 | 2022-01-26 | Ge Aviat Systems Ltd | Method and apparatus for a display module control system |
KR20200055202A (ko) * | 2018-11-12 | 2020-05-21 | 삼성전자주식회사 | 제스처에 의해 트리거 되는 음성 인식 서비스를 제공하는 전자 장치 및 그 동작 방법 |
CN109558035A (zh) * | 2018-11-27 | 2019-04-02 | 英华达(上海)科技有限公司 | 基于光线传感器的输入方法、终端设备和存储介质 |
CN110046585A (zh) * | 2019-04-19 | 2019-07-23 | 西北工业大学 | 一种基于环境光的手势识别方法 |
CN110764611A (zh) * | 2019-09-30 | 2020-02-07 | 深圳宝龙达信创科技股份有限公司 | 一种手势识别模块和笔记本 |
CN112710388B (zh) * | 2019-10-24 | 2022-07-01 | 北京小米移动软件有限公司 | 环境光检测方法、环境光检测装置、终端设备及存储介质 |
CN111623392A (zh) * | 2020-04-13 | 2020-09-04 | 华帝股份有限公司 | 一种带有手势识别组件的烟机及其控制方法 |
CN111596759A (zh) * | 2020-04-29 | 2020-08-28 | 维沃移动通信有限公司 | 操作手势识别方法、装置、设备及介质 |
CN112019978B (zh) * | 2020-08-06 | 2022-04-26 | 安徽华米信息科技有限公司 | 一种真无线立体声tws耳机的场景切换方法、装置及耳机 |
CN112433611A (zh) * | 2020-11-24 | 2021-03-02 | 珠海格力电器股份有限公司 | 一种终端设备的控制方法以及装置 |
CN112947753B (zh) * | 2021-02-19 | 2023-08-08 | 歌尔科技有限公司 | 可穿戴设备及其控制方法、可读存储介质 |
CN114020382A (zh) * | 2021-10-29 | 2022-02-08 | 杭州逗酷软件科技有限公司 | 一种执行方法、电子设备及计算机存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012065516A1 (zh) * | 2010-11-15 | 2012-05-24 | 惠州Tcl移动通信有限公司 | 一种通过手势识别实现照相机快门功能的方法及手机装置 |
CN103713735A (zh) * | 2012-09-29 | 2014-04-09 | 华为技术有限公司 | 一种使用非接触式手势控制终端设备的方法和装置 |
CN104484032A (zh) * | 2013-07-01 | 2015-04-01 | 黑莓有限公司 | 使用环境光传感器的手势检测 |
CN105511631A (zh) * | 2016-01-19 | 2016-04-20 | 北京小米移动软件有限公司 | 手势识别方法及装置 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4721159B2 (ja) * | 2005-03-28 | 2011-07-13 | ミネベア株式会社 | 面状照明装置 |
CN100504779C (zh) * | 2006-06-30 | 2009-06-24 | 联想(北京)有限公司 | 一种加速bios运行的方法 |
JP2010108081A (ja) * | 2008-10-28 | 2010-05-13 | Sharp Corp | メニュー表示装置、メニュー表示装置の制御方法、およびメニュー表示プログラム |
US20110242440A1 (en) * | 2009-01-20 | 2011-10-06 | Mikihiro Noma | Liquid crystal display device provided with light intensity sensor |
TWI590130B (zh) * | 2010-07-09 | 2017-07-01 | 群邁通訊股份有限公司 | 可擕式電子裝置及其解鎖/翻頁方法 |
CN104137027B (zh) * | 2011-10-10 | 2018-04-17 | 因维萨热技术公司 | 在空间和时间上的事件捕捉 |
KR101880998B1 (ko) * | 2011-10-14 | 2018-07-24 | 삼성전자주식회사 | 이벤트 기반 비전 센서를 이용한 동작 인식 장치 및 방법 |
WO2013114507A1 (ja) * | 2012-02-03 | 2013-08-08 | 日本電気株式会社 | 電磁波伝達シート、及び、電磁波伝送装置 |
US20150000247A1 (en) * | 2013-07-01 | 2015-01-01 | General Electric Company | System and method for detecting airfoil clash within a compressor |
EP2821889B1 (en) * | 2013-07-01 | 2018-09-05 | BlackBerry Limited | Performance control of ambient light sensors |
KR102104442B1 (ko) * | 2013-07-22 | 2020-04-24 | 엘지전자 주식회사 | 세탁물 처리기기 |
US9465448B2 (en) * | 2013-07-24 | 2016-10-11 | Blackberry Limited | Backlight for touchless gesture detection |
2016
- 2016-01-19 CN CN201610035203.3A patent/CN105511631B/zh active Active
- 2016-09-30 KR KR1020177031107A patent/KR102045232B1/ko active IP Right Grant
- 2016-09-30 RU RU2017140024A patent/RU2690202C2/ru active
- 2016-09-30 WO PCT/CN2016/100994 patent/WO2017124773A1/zh active Application Filing
- 2016-09-30 JP JP2016569698A patent/JP6533535B2/ja active Active
2017
- 2017-01-18 EP EP17152055.4A patent/EP3196736A1/en not_active Withdrawn
- 2017-01-18 US US15/409,017 patent/US20170205962A1/en not_active Abandoned
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012065516A1 (zh) * | 2010-11-15 | 2012-05-24 | 惠州Tcl移动通信有限公司 | 一种通过手势识别实现照相机快门功能的方法及手机装置 |
CN103713735A (zh) * | 2012-09-29 | 2014-04-09 | 华为技术有限公司 | 一种使用非接触式手势控制终端设备的方法和装置 |
CN104484032A (zh) * | 2013-07-01 | 2015-04-01 | 黑莓有限公司 | 使用环境光传感器的手势检测 |
CN105511631A (zh) * | 2016-01-19 | 2016-04-20 | 北京小米移动软件有限公司 | 手势识别方法及装置 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11045736B2 (en) | 2018-05-02 | 2021-06-29 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing apparatus, information processing system, and information processing method |
US11103784B2 (en) | 2018-05-02 | 2021-08-31 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing apparatus, information processing system, and information processing method |
US11484783B2 (en) | 2018-05-02 | 2022-11-01 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing apparatus, information processing system, and information processing method |
US11673043B2 (en) | 2018-05-02 | 2023-06-13 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing apparatus, information processing system, and information processing method |
Also Published As
Publication number | Publication date |
---|---|
RU2017140024A (ru) | 2019-05-17 |
RU2017140024A3 (zh) | 2019-05-17 |
CN105511631B (zh) | 2018-08-07 |
RU2690202C2 (ru) | 2019-05-31 |
CN105511631A (zh) | 2016-04-20 |
EP3196736A1 (en) | 2017-07-26 |
KR102045232B1 (ko) | 2019-11-15 |
US20170205962A1 (en) | 2017-07-20 |
JP2018506086A (ja) | 2018-03-01 |
KR20170132264A (ko) | 2017-12-01 |
JP6533535B2 (ja) | 2019-06-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017124773A1 (zh) | 手势识别方法及装置 | |
WO2017071068A1 (zh) | 应用程序切换方法、装置及设备 | |
JP6199510B2 (ja) | ディスプレイモードを切り換える方法及び装置 | |
WO2017088579A1 (zh) | 指纹识别方法及装置 | |
JP6559769B2 (ja) | ジェスチャーの識別方法及び装置 | |
US20170344192A1 (en) | Method and device for playing live videos | |
WO2017059638A1 (zh) | 指纹识别方法及装置 | |
US20170060320A1 (en) | Method for controlling a mobile terminal using a side touch panel | |
WO2017177597A1 (zh) | 实体按键组件、终端、触控响应方法及装置 | |
CN111695382B (zh) | 指纹采集区域确定方法和指纹采集区域确定装置 | |
WO2017071050A1 (zh) | 具有触摸屏的终端的防误触方法及装置 | |
CN108255369B (zh) | 屏内指纹图标的显示方法、装置及计算机可读存储介质 | |
RU2683979C2 (ru) | Способ и устройство для обнаружения давления | |
EP3232301B1 (en) | Mobile terminal and virtual key processing method | |
US10061497B2 (en) | Method, device and storage medium for interchanging icon positions | |
WO2017201891A1 (zh) | 控制触控屏状态的方法及装置、电子设备 | |
WO2016206295A1 (zh) | 字符确定方法及装置 | |
US20190370584A1 (en) | Collecting fingerprints | |
WO2016095395A1 (zh) | 激活移动终端的操作状态的方法及装置 | |
US20170336962A1 (en) | Gesture response method and device | |
US20160195992A1 (en) | Mobile terminal and method for processing signals generated from touching virtual keys | |
CN107329604B (zh) | 一种移动终端控制方法及装置 | |
US20160173668A1 (en) | Method and device for activating an operating state of a mobile terminal | |
CN111736718A (zh) | 触摸屏控制方法及装置 | |
WO2023230829A1 (zh) | 一种触控检测方法、装置及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2016569698 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16886033 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20177031107 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017140024 Country of ref document: RU |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16886033 Country of ref document: EP Kind code of ref document: A1 |