WO2017124773A1 - Gesture recognition method and apparatus - Google Patents

Gesture recognition method and apparatus

Info

Publication number
WO2017124773A1
WO2017124773A1 (PCT/CN2016/100994)
Authority
WO
WIPO (PCT)
Prior art keywords
ambient light
light sensor
gesture
module
state
Prior art date
Application number
PCT/CN2016/100994
Other languages
English (en)
French (fr)
Inventor
李国盛
孙伟
江忠胜
Original Assignee
北京小米移动软件有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京小米移动软件有限公司
Priority to RU2017140024A priority Critical patent/RU2690202C2/ru
Priority to JP2016569698A priority patent/JP6533535B2/ja
Priority to KR1020177031107A priority patent/KR102045232B1/ko
Publication of WO2017124773A1 publication Critical patent/WO2017124773A1/zh

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 - Detection arrangements using opto-electronic means
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 - Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 - Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04108 - Touchless 2D digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface, without distance measurement in the Z direction

Definitions

  • the present disclosure relates to the field of display technologies, and in particular, to a gesture recognition method and apparatus.
  • Touch screens support more and more functions, such as gesture recognition.
  • To recognize a gesture, the touch screen first determines the position at which the user's finger touches the screen, and then recognizes the gesture made by the user according to that touch position.
  • the present disclosure provides a gesture recognition method and apparatus.
  • a method for recognizing a gesture is provided, applied to a terminal that includes a touch screen in which ambient light sensors are distributed, the method including:
  • the preset change rule is that light incident on the ambient light sensor first switches from the unoccluded state to the occluded state, and then from the occluded state back to the unoccluded state;
  • the user's operational gesture is identified based on the location of the at least one ambient light sensor.
  • detecting whether the at least one ambient light sensor meets the preset change rule includes:
  • when the light intensity value first decreases and then increases, it is determined that the light incident on the ambient light sensor first switched from the unoccluded state to the occluded state and then back to the unoccluded state, so the preset change rule is satisfied.
  • the user's operation gesture is identified according to the location of the at least one ambient light sensor, including:
  • the positions of the at least one ambient light sensor, taken in that order, are identified as an operation gesture.
  • the method further includes:
  • the method further includes:
  • the user's occlusion gesture is recognized when the respective minimum values remain unchanged for the same time period.
  • a gesture recognition apparatus for use in a terminal including a touch screen, wherein the ambient light sensor is distributed in the touch screen, and the apparatus includes:
  • the first detecting module is configured to detect, when light incident on the at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from the unoccluded state to the occluded state and then from the occluded state back to the unoccluded state;
  • a first determining module configured to determine a location of the at least one ambient light sensor when the first detecting module detects that the at least one ambient light sensor meets the preset change rule;
  • the first identification module is configured to identify an operation gesture of the user according to the location of the at least one ambient light sensor determined by the first determining module.
  • the first detecting module includes:
  • a first acquiring submodule configured to acquire, for each ambient light sensor, a light intensity value measured by the ambient light sensor
  • the detecting submodule is configured to detect whether the light intensity value acquired by the first acquiring submodule is first reduced and then increased;
  • the determining submodule is configured to determine, when the detecting submodule finds that the light intensity value first decreases and then increases, that the light incident on the ambient light sensor first switched from the unoccluded state to the occluded state and then back to the unoccluded state, so the preset change rule is satisfied.
  • the first identification module includes:
  • a second acquiring submodule configured to acquire an order in which at least one ambient light sensor is sequentially occluded
  • the identification sub-module is configured to identify the location of the at least one ambient light sensor as an operational gesture in an order acquired by the second acquisition sub-module.
  • the device further includes:
  • the first obtaining module is configured to acquire, from the at least one ambient light sensor, a change duration of the light intensity value of each ambient light sensor;
  • a calculation module configured to calculate an average value of each change duration acquired by the first acquisition module
  • a second determining module configured to determine an operating speed of the gesture according to an average value calculated by the computing module
  • the third determining module is configured to determine a response manner to the gesture according to the operating speed determined by the second determining module.
  • the device further includes:
  • a second acquiring module configured to acquire, from the at least one ambient light sensor, a minimum value of a change in the light intensity value of each ambient light sensor
  • a second detecting module configured to detect whether each minimum value acquired by the second acquiring module remains constant over the same time period;
  • the second identification module is configured to recognize the occlusion gesture of the user when the result detected by the second detection module is that the respective minimum values remain unchanged for the same period of time.
  • a gesture recognition apparatus for use in a terminal including a touch screen, wherein the ambient light sensor is distributed in the touch screen, the apparatus includes:
  • a memory for storing processor executable instructions
  • processor is configured to:
  • the preset change rule is that light incident on the ambient light sensor first switches from the unoccluded state to the occluded state, and then from the occluded state back to the unoccluded state;
  • the user's operational gesture is identified based on the location of the at least one ambient light sensor.
  • this solves the problem that, when the user cannot conveniently touch the touch screen, the terminal cannot recognize the operation gesture made by the user; it thereby adds a way of recognizing gestures and improves the flexibility of gesture recognition.
  • the set of operation gestures can be extended, which addresses the problem that few operation gestures are available when the touch screen is not touched, and that the touch screen has correspondingly few ways to respond to such gestures.
  • FIG. 1 is a flowchart of a gesture recognition method according to an exemplary embodiment.
  • FIG. 2A is a flowchart of a gesture recognition method according to another exemplary embodiment.
  • FIG. 2B is a scene diagram of determining an operation position of an operation gesture, according to an exemplary embodiment.
  • FIG. 2C is a scene diagram illustrating an operational position of determining an operation gesture, according to another exemplary embodiment.
  • FIG. 2D is a scene diagram of a recognition operation gesture according to an exemplary embodiment.
  • FIG. 2E is a scene diagram illustrating a recognition operation gesture, according to another exemplary embodiment.
  • FIG. 2F is a scene diagram of a recognition operation gesture according to another exemplary embodiment.
  • FIG. 2G is a flowchart illustrating a method of identifying a speed of an operation gesture, according to an exemplary embodiment.
  • FIG. 2H is a flowchart of a method for identifying an occlusion gesture according to an exemplary embodiment.
  • FIG. 3 is a block diagram of a gesture recognition apparatus, according to an exemplary embodiment.
  • FIG. 4 is a block diagram of a gesture recognition apparatus, according to an exemplary embodiment.
  • FIG. 5 is a block diagram of an apparatus for gesture recognition, according to an exemplary embodiment.
  • FIG. 1 is a flowchart of a gesture recognition method according to an exemplary embodiment; the method is applied to a terminal including a touch screen in which ambient light sensors are distributed.
  • the method includes the following steps.
  • step 101: when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, detect whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from the unoccluded state to the occluded state and then from the occluded state back to the unoccluded state.
  • step 102 the position of the at least one ambient light sensor is determined when the at least one ambient light sensor satisfies a preset change rule.
  • step 103 the user's operational gesture is identified based on the location of the at least one ambient light sensor.
  • In summary, in the gesture recognition method provided by this embodiment, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, the terminal detects whether the light incident on each ambient light sensor first switches from the unoccluded state to the occluded state and then back to the unoccluded state, and recognizes the user's operation gesture according to the positions of the at least one ambient light sensor. The terminal can thus recognize an operation gesture without the user touching the touch screen, which solves the problem that the terminal cannot recognize the user's gesture when a touch operation is inconvenient, adds a way of recognizing gestures, and improves the flexibility of gesture recognition.
  • FIG. 2A is a flowchart of a gesture recognition method applied to a terminal including a touch screen in which an ambient light sensor is distributed, as shown in FIG. 2A, according to another exemplary embodiment.
  • the identification method includes the following steps.
  • step 201: when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, detect whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from the unoccluded state to the occluded state and then from the occluded state back to the unoccluded state.
  • In the related art, the user's finger must touch the touch screen for the terminal to recognize the operation gesture; that is, the operation gesture must act on the touch screen.
  • If the user's finger is dirty and it is inconvenient to touch the touch screen, the user must first wipe the finger and only then perform the touch operation, so the user cannot operate the terminal in time; alternatively, touching the screen directly contaminates the touch screen.
  • This embodiment therefore provides a method for recognizing a user's operation gesture without the finger directly touching the touch screen, as follows.
  • The ambient light sensors distributed in the touch screen can measure the intensity of the light incident on them. When an object blocks the touch screen, the light intensity value measured by the occluded ambient light sensor decreases. An ambient light sensor can therefore determine from its light intensity value whether it is occluded, and report an occlusion event to the terminal when it is.
  • After receiving an occlusion event reported by the at least one ambient light sensor, the terminal detects whether there is a touch operation on the touch screen. If there is, the occlusion event corresponds to an operation gesture applied to the touch screen; if there is not, it indicates that the occlusion event corresponds to an operation gesture that does not act on the touch screen.
  • the operation gesture is a sliding gesture.
  • When the user makes a sliding gesture over the screen, the light incident on the ambient light sensor first switches from the unoccluded state to the occluded state and then from the occluded state back to the unoccluded state, so whether the operation gesture made by the user is a swipe can be identified by detecting whether the at least one ambient light sensor satisfies the preset change rule.
  • the method for detecting whether at least one ambient light sensor meets a preset change rule includes:
  • For each ambient light sensor, obtain the light intensity value measured by that sensor; detect whether the light intensity value first decreases and then increases; and when it does, determine that the light incident on the sensor first switched from the unoccluded state to the occluded state and then back to the unoccluded state, so the preset change rule is satisfied.
  • When the light incident on an ambient light sensor is not blocked, the sensor measures a larger light intensity value; when the light is blocked, it measures a smaller value.
  • The occlusion state of the ambient light sensor can therefore be determined from the change in its light intensity value: when the value changes from large to small, the incident light is determined to have switched from the unoccluded state to the occluded state; when the value then changes from small to large, the incident light is determined to have switched from the occluded state back to the unoccluded state.
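The "first decreases, then increases" check described above can be sketched as follows. The patent discloses no code, so this is a minimal Python illustration; the function name, the chronological-sample input format, and the `tolerance` parameter are all assumptions.

```python
def satisfies_preset_rule(intensity_samples, tolerance=0.0):
    """Return True if a sensor's intensity trace first decreases and then
    increases, i.e. the incident light went unoccluded -> occluded -> unoccluded.
    `intensity_samples` is the chronological list of measured values."""
    if len(intensity_samples) < 3:
        return False
    low = min(intensity_samples)
    trough = intensity_samples.index(low)
    # The trace must dip below both endpoints and recover afterwards.
    fell = intensity_samples[0] - low > tolerance
    rose = intensity_samples[-1] - low > tolerance
    return 0 < trough < len(intensity_samples) - 1 and fell and rose
```

A strictly monotonic trace (light only getting dimmer, as when the room darkens) does not satisfy the rule, which is what lets the terminal distinguish a passing hand from a general lighting change.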
  • step 202 the location of the at least one ambient light sensor is determined when the at least one ambient light sensor satisfies the preset change rule.
  • The position of the ambient light sensor at the center of each shaded portion may be taken as the position of the sensors satisfying the preset change rule at the current moment, that is, the operation position of the operation gesture at that moment.
  • the user makes an operation gesture 2, and a shadow portion 3 is formed on the terminal 1.
  • The terminal 1 acquires the positions of the at least one ambient light sensor corresponding to the shaded portion 3 and calculates the center point 4 of those positions; the center point 4 is used as the operation position of the operation gesture at that moment.
  • The user makes an operation gesture 5, forming a shaded portion 6 and a shaded portion 7 on the terminal 1. The terminal 1 acquires the sensor positions corresponding to the shaded portion 6 and calculates their center point 8, and acquires the sensor positions corresponding to the shaded portion 7 and calculates their center point 9; the center points 8 and 9 serve as the operation positions of the operation gesture at that moment.
  • The shaded portion may be determined as a region composed of at least one ambient light sensor whose light intensity values are equal at the same moment and whose positions are contiguous.
  • the present embodiment does not limit the method for determining the shadow portion.
  • the terminal determines an operation position of the operation gesture at each moment according to the position of the at least one ambient light sensor, and the operation gesture of the user can be recognized according to the operation position of each moment.
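The center-point calculation described above amounts to taking the centroid of the occluded sensors' positions. A minimal Python sketch (the patent gives no formula; the grid-coordinate representation of sensor positions is an assumption):

```python
def operation_position(occluded_sensor_positions):
    """Return the center point of the sensor positions covered by one
    shadow region; this center is taken as the gesture's operation
    position at the current moment. Positions are (x, y) grid units."""
    xs = [x for x, _ in occluded_sensor_positions]
    ys = [y for _, y in occluded_sensor_positions]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

With two disjoint shadow regions (as in the two-finger example), the function is simply applied once per region, yielding one operation position per shaded portion.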
  • step 203 an order in which at least one ambient light sensor is sequentially occluded is acquired.
  • When receiving an occlusion event, the terminal may record the time of receipt. Therefore, when the user's operation gesture is produced by a continuous action, such as a sliding operation, the terminal can acquire the times at which the at least one ambient light sensor was successively occluded and determine from those times the order in which the sensors were occluded.
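Recovering the occlusion order from the recorded receipt times is a simple sort by timestamp. A sketch under assumed data shapes (sensor ids and a dict of receipt times are illustrative, not from the patent):

```python
def occlusion_order(occlusion_events):
    """Sort sensor ids by the time their occlusion event was received,
    giving the order in which the sensors were successively blocked.
    `occlusion_events` maps sensor id -> receipt timestamp (seconds)."""
    return sorted(occlusion_events, key=occlusion_events.get)
```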
  • step 204 the position of the at least one ambient light sensor is identified as an operational gesture in sequence.
  • According to the operation positions determined at successive moments, the operation trajectory of an operation gesture that does not touch the touch screen can be obtained; according to the order in which the at least one ambient light sensor was sequentially occluded, determined in step 203, the operation direction can be obtained. From the operation trajectory and the operation direction, the terminal can recognize the operation gesture made by the user.
  • For example, the operation position 11 is determined at the first moment, the operation position 12 at the second moment, and the operation position 13 at the third moment; according to the determined positions and the order in which they were determined, the user's operation gesture is recognized as a swipe to the right.
  • the terminal may further set a first angle threshold and a second angle threshold to determine an operation direction of the operation gesture, where the second angle threshold is greater than the first angle threshold.
  • The terminal selects the operation positions determined at any two moments. If the angle between the line connecting the two operation positions and the horizontal direction is smaller than the first angle threshold, the operation gesture is recognized as a leftward or rightward slide; if the angle is greater than the first angle threshold but smaller than the second angle threshold, the gesture is recognized as an oblique slide; if the angle is greater than the second angle threshold, the gesture is recognized as an upward or downward slide.
  • For example, the terminal 1 sets a first angle threshold of 30 degrees and a second angle threshold of 60 degrees. The operation position 12 is determined at an arbitrarily selected first moment and the operation position 14 at an arbitrarily selected second moment; the angle between the line connecting the two positions and the horizontal direction is 45 degrees, which is greater than the first threshold and smaller than the second. Therefore, from the determined positions, their order, and the angle between the connecting line and the horizontal direction, the terminal recognizes the user's operation gesture as a swipe to the upper right.
  • Alternatively, the terminal can calculate, for each pair of operation positions at adjacent moments, the angle between their connecting line and the horizontal direction, and compare the average of these angles with the two thresholds: if the average is smaller than the first angle threshold, the gesture is recognized as a leftward or rightward slide; if it is greater than the first angle threshold but smaller than the second, as an oblique slide; if it is greater than the second angle threshold, as an upward or downward slide.
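The two-threshold direction test above can be sketched in Python. The 30- and 60-degree defaults follow the example in the text; the fold into [0, 90] degrees relative to the horizontal, and the category names, are illustrative assumptions:

```python
import math

def classify_direction(p1, p2, first_threshold=30.0, second_threshold=60.0):
    """Classify a swipe by the angle (degrees) between the line through
    two operation positions and the horizontal axis."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    angle = abs(math.degrees(math.atan2(dy, dx)))
    if angle > 90:            # fold into [0, 90] relative to horizontal
        angle = 180 - angle
    if angle < first_threshold:
        return "horizontal"   # leftward or rightward slide
    if angle < second_threshold:
        return "oblique"      # diagonal slide
    return "vertical"         # upward or downward slide
```

The 45-degree example from the text falls between the thresholds and is classified as oblique, matching the upper-right swipe case.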
  • If the terminal determines at least two operation positions at the same moment, then for each operation position the terminal groups it with the nearest operation position at the adjacent moment into an operation combination, and recognizes the user's operation gesture according to the determined operation combinations.
  • For example, operation positions 11 and 15 are determined at the first moment, operation positions 12 and 16 at the second moment, and operation positions 13 and 17 at the third moment. The terminal determines positions 11, 12, and 13 as one operation combination and positions 15, 16, and 17 as another, and recognizes the user's operation gesture from the two combinations.
  • FIG. 2G is a flowchart illustrating a method for recognizing the speed of the operation gesture according to an exemplary embodiment.
  • the method for recognizing the gesture speed is applied to a terminal including a touch screen.
  • the ambient light sensor is distributed in the touch screen.
  • the method for identifying the gesture speed includes the following steps.
  • step 205 a change duration of the light intensity values of the respective ambient light sensors is acquired from the at least one ambient light sensor.
  • For each ambient light sensor, in one possible implementation the terminal records the times at which the sensor's light intensity value starts and stops changing; when the sensor satisfies the preset change rule, the terminal calculates the change duration of the light intensity value from the recorded times. In another possible implementation, the terminal acquires the change duration of the sensor's light intensity value before detecting that the sensor meets the preset change rule, and directly reads that previously obtained duration once the rule is detected. This embodiment does not limit the timing at which the terminal obtains the change duration.
  • step 206 an average of the respective varying durations is calculated.
  • the terminal calculates an average value of the obtained change durations, or the terminal selects at least one change duration from the acquired change durations, and calculates an average value of the at least one change duration.
  • step 207 the operating speed of the gesture is determined based on the average.
  • a time threshold is preset in the terminal, and the operation speed of the gesture can be determined by comparing the average value with the time threshold.
  • the time threshold may be one or more.
  • For example, two time thresholds are set in the terminal: a first time threshold and a second time threshold, the second being less than the first. If the average value is greater than the first time threshold, the terminal determines that the gesture is a slow gesture; if the average value is less than the second time threshold, the terminal determines that the gesture is a fast gesture; if the average value is between the two thresholds, the terminal determines that the gesture is a medium-speed gesture.
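The two-threshold speed test can be sketched as follows; the concrete threshold values (0.15 s and 0.6 s) and the returned labels are illustrative assumptions, since the patent specifies only the comparison scheme:

```python
def classify_speed(change_durations, fast_threshold=0.15, slow_threshold=0.6):
    """Average the per-sensor change durations (seconds) and compare the
    mean with two thresholds: below `fast_threshold` (the second time
    threshold) -> fast; above `slow_threshold` (the first time threshold)
    -> slow; otherwise medium."""
    avg = sum(change_durations) / len(change_durations)
    if avg < fast_threshold:
        return "fast"
    if avg > slow_threshold:
        return "slow"
    return "medium"
```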
  • step 208: a response manner to the gesture is determined based on the operating speed.
  • By determining different response manners for different operating speeds, the ways in which the terminal responds to an operation gesture are expanded.
  • For example, the response mode corresponding to a fast rightward-slide gesture is fast-forwarding the video, while the response mode corresponding to a slow rightward-slide gesture is jumping to the next video.
  • the embodiment further provides a method for identifying an occlusion gesture made by a user, and a flowchart for identifying an occlusion gesture, as shown in FIG. 2H, the method includes the following steps.
  • step 209 a minimum value of the change in the light intensity value of each of the ambient light sensors is acquired from the at least one ambient light sensor.
  • After the terminal determines that the at least one ambient light sensor meets the preset change rule, it acquires the minimum light intensity value of each ambient light sensor during the change, the minimum value being the light intensity measured by the sensor while the incident light is blocked.
  • the minimum values of the light intensity values measured by the respective ambient light sensors are the same.
  • step 210 it is detected whether the respective minimum values remain unchanged for the same period of time.
  • the terminal detects that there is a minimum value of at least one light intensity value at the same time, it continues to detect whether the at least one minimum value remains unchanged for the same time period.
  • step 211 the user's occlusion gesture is recognized when the respective minimum values remain unchanged for the same time period.
  • the terminal may recognize that the user's operation gesture is an occlusion gesture when each minimum value remains unchanged for the same period of time.
  • By recognizing the user's occlusion gesture, the terminal expands its response manners to operation gestures. For example, when the terminal recognizes that the user has made an occlusion gesture, the corresponding response may be to pause the video or music being played, or to click an application.
Optionally, a first predetermined time may be set in the terminal; when the duration for which each minimum value remains unchanged is greater than or equal to the first predetermined time, the user's operation gesture is identified as a first type of occlusion gesture. Alternatively, a second predetermined time may be set in the terminal; when that duration is less than or equal to the second predetermined time, the user's operation gesture is identified as a second type of occlusion gesture. The terminal sets different response modes for different types of occlusion gestures.
In summary, in the gesture recognition method provided here, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, and the light incident on each such ambient light sensor is detected to switch first from the unoccluded state to the occluded state and then from the occluded state back to the unoccluded state, the user's operation gesture is recognized according to the position of the at least one ambient light sensor. The terminal can thus recognize an operation gesture without the user performing a touch operation on the touch screen, which solves the problem that the terminal cannot recognize the user's gestures when touching the screen is inconvenient, adds a way of recognizing gestures, and improves the flexibility of gesture recognition. Moreover, by recognizing the speed of the operation gesture and the occlusion gesture, the set of operation gestures can be extended, which solves the problem that the user can make only a few gestures without touching the screen, so that the touch screen has only a few ways of responding, and increases the ways in which the touch screen can respond to operation gestures.
FIG. 3 is a block diagram of a gesture recognition apparatus according to an exemplary embodiment. The apparatus is applied in a terminal that includes a touch screen in which ambient light sensors are distributed. As shown in FIG. 3, the gesture recognition apparatus includes a first detecting module 310, a first determining module 320, and a first identifying module 330.
The first detecting module 310 is configured to detect, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from the unoccluded state to the occluded state and then switches from the occluded state back to the unoccluded state.
The first determining module 320 is configured to determine the position of the at least one ambient light sensor when the result detected by the first detecting module 310 is that the at least one ambient light sensor meets the preset change rule.
The first identifying module 330 is configured to recognize the user's operation gesture according to the position of the at least one ambient light sensor determined by the first determining module 320.
In summary, in the gesture recognition apparatus provided here, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, and the light incident on each such ambient light sensor is detected to switch first from the unoccluded state to the occluded state and then back to the unoccluded state, the user's operation gesture is recognized according to the position of the at least one ambient light sensor. The terminal can thus recognize an operation gesture without the user performing a touch operation on the touch screen, which solves the problem that the terminal cannot recognize the user's gestures when touching the screen is inconvenient, adds a way of recognizing gestures, and improves the flexibility of gesture recognition.
FIG. 4 is a block diagram of a gesture recognition apparatus according to an exemplary embodiment. The apparatus is applied in a terminal that includes a touch screen in which ambient light sensors are distributed. As shown in FIG. 4, the gesture recognition apparatus includes a first detecting module 410, a first determining module 420, and a first identifying module 430.
The first detecting module 410 is configured to detect, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from the unoccluded state to the occluded state and then switches from the occluded state back to the unoccluded state.
The first determining module 420 is configured to determine the position of the at least one ambient light sensor when the result detected by the first detecting module 410 is that the at least one ambient light sensor meets the preset change rule.
The first identifying module 430 is configured to recognize the user's operation gesture according to the position of the at least one ambient light sensor determined by the first determining module 420.
Optionally, the first detecting module 410 includes a first acquiring submodule 411, a detecting submodule 412, and a determining submodule 413.
The first acquiring submodule 411 is configured to acquire, for each ambient light sensor, the light intensity value measured by that sensor.
The detecting submodule 412 is configured to detect whether the light intensity value acquired by the first acquiring submodule 411 first decreases and then increases.
The determining submodule 413 is configured to determine, when the result detected by the detecting submodule 412 is that the light intensity value first decreases and then increases, that the light incident on the ambient light sensor first switched from the unoccluded state to the occluded state and then switched from the occluded state back to the unoccluded state, satisfying the preset change rule.
Optionally, the first identifying module 430 includes a second acquiring submodule 431 and an identifying submodule 432.
The second acquiring submodule 431 is configured to acquire the order in which the at least one ambient light sensor is successively occluded.
The identifying submodule 432 is configured to recognize the positions of the at least one ambient light sensor, taken in the order acquired by the second acquiring submodule 431, as the operation gesture.
Optionally, the apparatus further includes a first acquiring module 440, a calculating module 450, a second determining module 460, and a third determining module 470.
The first acquiring module 440 is configured to acquire, from the at least one ambient light sensor, the duration of the change in each ambient light sensor's light intensity value.
The calculating module 450 is configured to calculate the average of the change durations acquired by the first acquiring module 440.
The second determining module 460 is configured to determine the operating speed of the gesture according to the average calculated by the calculating module 450.
The third determining module 470 is configured to determine the manner of responding to the gesture according to the operating speed determined by the second determining module 460.
Optionally, the apparatus further includes a second acquiring module 480, a second detecting module 490, and a second identifying module 491.
The second acquiring module 480 is configured to acquire, from the at least one ambient light sensor, the minimum value of each ambient light sensor's changing light intensity.
The second detecting module 490 is configured to detect whether the respective minimum values acquired by the second acquiring module 480 remain unchanged over the same time period.
The second identifying module 491 is configured to recognize the user's occlusion gesture when the result detected by the second detecting module 490 is that the respective minimum values remain unchanged over the same time period.
In summary, in the gesture recognition apparatus provided here, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, and the light incident on each such ambient light sensor is detected to switch first from the unoccluded state to the occluded state and then from the occluded state back to the unoccluded state, the user's operation gesture is recognized according to the position of the at least one ambient light sensor. The terminal can thus recognize an operation gesture without the user performing a touch operation on the touch screen, which solves the problem that the terminal cannot recognize the user's gestures when touching the screen is inconvenient, adds a way of recognizing gestures, and improves the flexibility of gesture recognition. Moreover, by recognizing the speed of the operation gesture and the occlusion gesture, the set of operation gestures can be extended, which solves the problem that the user can make only a few gestures without touching the screen, so that the touch screen has only a few ways of responding, and increases the ways in which the touch screen can respond to operation gestures.
An exemplary embodiment of the present disclosure provides a gesture recognition apparatus capable of implementing the gesture recognition method provided by the present disclosure. The apparatus is applied in a terminal that includes a touch screen in which ambient light sensors are distributed, and includes a processor and a memory for storing processor-executable instructions, wherein the processor is configured to:
when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, detect whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from the unoccluded state to the occluded state and then switches from the occluded state back to the unoccluded state;
when the at least one ambient light sensor satisfies the preset change rule, determine the position of the at least one ambient light sensor; and
recognize the user's operation gesture according to the position of the at least one ambient light sensor.
FIG. 5 is a block diagram of an apparatus 500 for gesture recognition, according to an exemplary embodiment. For example, the apparatus 500 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.
Referring to FIG. 5, the apparatus 500 may include one or more of the following components: a processing component 502, a memory 504, a power component 506, a multimedia component 508, an audio component 510, an input/output (I/O) interface 512, a sensor component 514, and a communication component 516.
The processing component 502 typically controls the overall operation of the apparatus 500, such as operations associated with display, telephone calls, data communications, camera operation, and recording. The processing component 502 may include one or more processors 518 to execute instructions so as to perform all or part of the steps of the methods described above. In addition, the processing component 502 may include one or more modules that facilitate interaction between the processing component 502 and the other components. For example, the processing component 502 may include a multimedia module to facilitate interaction between the multimedia component 508 and the processing component 502.
The memory 504 is configured to store various types of data to support operation of the apparatus 500. Examples of such data include instructions for any application or method operated on the apparatus 500, contact data, phone book data, messages, pictures, videos, and so on. The memory 504 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk.
The power component 506 provides power to the various components of the apparatus 500. It may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the apparatus 500.
The multimedia component 508 includes a screen that provides an output interface between the apparatus 500 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or slide action but also the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 508 includes a front camera and/or a rear camera. When the apparatus 500 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front or rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 510 is configured to output and/or input audio signals. For example, the audio component 510 includes a microphone (MIC) that is configured to receive external audio signals when the apparatus 500 is in an operating mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signal may be further stored in the memory 504 or transmitted via the communication component 516. In some embodiments, the audio component 510 also includes a speaker for outputting audio signals.
The I/O interface 512 provides an interface between the processing component 502 and peripheral interface modules, such as a keyboard, a click wheel, or buttons. These buttons may include, but are not limited to, a home button, volume buttons, a start button, and a lock button.
The sensor component 514 includes one or more sensors for providing status assessments of various aspects of the apparatus 500. For example, the sensor component 514 can detect the open/closed state of the apparatus 500 and the relative positioning of components (for example, the display and keypad of the apparatus 500); the sensor component 514 can also detect a change in position of the apparatus 500 or of one of its components, the presence or absence of user contact with the apparatus 500, the orientation or acceleration/deceleration of the apparatus 500, and changes in its temperature.
The sensor component 514 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 514 may also include an ambient light sensor for detecting the intensity of the ambient light around the apparatus 500. In some embodiments, the sensor component 514 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 516 is configured to facilitate wired or wireless communication between the apparatus 500 and other devices. The apparatus 500 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof.
In one exemplary embodiment, the communication component 516 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 516 also includes a near field communication (NFC) module to facilitate short-range communication.
For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 500 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the methods described above.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 504 including instructions executable by the processor 518 of the apparatus 500 to perform the methods described above. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.

Abstract

A gesture recognition method and apparatus, for use in a terminal that includes a touch screen in which ambient light sensors are distributed. The method includes: when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, detecting whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unoccluded state to an occluded state and then switches from the occluded state back to the unoccluded state (101); when the at least one ambient light sensor satisfies the preset change rule, determining the position of the at least one ambient light sensor (102); and recognizing the user's operation gesture according to the position of the at least one ambient light sensor (103). The method adds a way of recognizing gestures and improves the flexibility of gesture recognition.

Description

Gesture recognition method and apparatus
This application is based on, and claims priority to, Chinese patent application No. 201610035203.3, filed on January 19, 2016, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of display technology, and in particular to a gesture recognition method and apparatus.
Background
With the development of touch screen technology, touch screens provide more and more functions, such as gesture recognition.
In the related art, the touch screen first determines the position at which the user's finger touches it, and then recognizes the gesture made by the user according to that touch position.
Summary
To solve the problems in the related art, the present disclosure provides a gesture recognition method and apparatus.
According to a first aspect of the embodiments of the present disclosure, a gesture recognition method is provided, for use in a terminal that includes a touch screen in which ambient light sensors are distributed, the method including:
when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, detecting whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unoccluded state to an occluded state and then switches from the occluded state back to the unoccluded state;
when the at least one ambient light sensor satisfies the preset change rule, determining the position of the at least one ambient light sensor; and
recognizing the user's operation gesture according to the position of the at least one ambient light sensor.
Optionally, detecting whether the at least one ambient light sensor satisfies the preset change rule includes:
for each ambient light sensor, acquiring the light intensity value measured by the ambient light sensor;
detecting whether the light intensity value first decreases and then increases; and
when the light intensity value first decreases and then increases, determining that the light incident on the ambient light sensor first switched from the unoccluded state to the occluded state and then switched from the occluded state back to the unoccluded state, satisfying the preset change rule.
Optionally, recognizing the user's operation gesture according to the position of the at least one ambient light sensor includes:
acquiring the order in which the at least one ambient light sensor is successively occluded; and
recognizing the positions of the at least one ambient light sensor, taken in that order, as the operation gesture.
Optionally, the method further includes:
acquiring, from the at least one ambient light sensor, the duration of the change in each ambient light sensor's light intensity value;
calculating the average of the change durations;
determining the operating speed of the gesture according to the average; and
determining the manner of responding to the gesture according to the operating speed.
Optionally, the method further includes:
acquiring, from the at least one ambient light sensor, the minimum value of each ambient light sensor's changing light intensity;
detecting whether the respective minimum values remain unchanged over the same time period; and
recognizing the user's occlusion gesture when the respective minimum values remain unchanged over the same time period.
According to a second aspect of the embodiments of the present disclosure, a gesture recognition apparatus is provided, for use in a terminal that includes a touch screen in which ambient light sensors are distributed, the apparatus including:
a first detecting module configured to detect, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unoccluded state to an occluded state and then switches from the occluded state back to the unoccluded state;
a first determining module configured to determine the position of the at least one ambient light sensor when the result detected by the first detecting module is that the at least one ambient light sensor satisfies the preset change rule; and
a first identifying module configured to recognize the user's operation gesture according to the position of the at least one ambient light sensor determined by the first determining module.
Optionally, the first detecting module includes:
a first acquiring submodule configured to acquire, for each ambient light sensor, the light intensity value measured by the ambient light sensor;
a detecting submodule configured to detect whether the light intensity value acquired by the first acquiring submodule first decreases and then increases; and
a determining submodule configured to determine, when the result detected by the detecting submodule is that the light intensity value first decreases and then increases, that the light incident on the ambient light sensor first switched from the unoccluded state to the occluded state and then switched from the occluded state back to the unoccluded state, satisfying the preset change rule.
Optionally, the first identifying module includes:
a second acquiring submodule configured to acquire the order in which the at least one ambient light sensor is successively occluded; and
an identifying submodule configured to recognize the positions of the at least one ambient light sensor, taken in the order acquired by the second acquiring submodule, as the operation gesture.
Optionally, the apparatus further includes:
a first acquiring module configured to acquire, from the at least one ambient light sensor, the duration of the change in each ambient light sensor's light intensity value;
a calculating module configured to calculate the average of the change durations acquired by the first acquiring module;
a second determining module configured to determine the operating speed of the gesture according to the average calculated by the calculating module; and
a third determining module configured to determine the manner of responding to the gesture according to the operating speed determined by the second determining module.
Optionally, the apparatus further includes:
a second acquiring module configured to acquire, from the at least one ambient light sensor, the minimum value of each ambient light sensor's changing light intensity;
a second detecting module configured to detect whether the respective minimum values acquired by the second acquiring module remain unchanged over the same time period; and
a second identifying module configured to recognize the user's occlusion gesture when the result detected by the second detecting module is that the respective minimum values remain unchanged over the same time period.
According to a third aspect of the embodiments of the present disclosure, a gesture recognition apparatus is provided, for use in a terminal that includes a touch screen in which ambient light sensors are distributed, the apparatus including:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to:
when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, detect whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unoccluded state to an occluded state and then switches from the occluded state back to the unoccluded state;
when the at least one ambient light sensor satisfies the preset change rule, determine the position of the at least one ambient light sensor; and
recognize the user's operation gesture according to the position of the at least one ambient light sensor.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects:
When light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, if the light incident on each such ambient light sensor is detected to switch first from the unoccluded state to the occluded state and then from the occluded state back to the unoccluded state, the user's operation gesture is recognized according to the position of the at least one ambient light sensor. The terminal can thus recognize an operation gesture without the user performing a touch operation on the touch screen, which solves the problem that the terminal cannot recognize the user's gestures when touching the screen is inconvenient, adds a way of recognizing gestures, and improves the flexibility of gesture recognition.
Moreover, by recognizing the speed of the operation gesture as well as the occlusion gesture, the set of operation gestures can be extended, which solves the problem that the user can make only a few gestures without touching the screen, so that the touch screen has only a few ways of responding, and increases the ways in which the touch screen can respond to operation gestures.
It should be understood that the above general description and the following detailed description are merely exemplary and do not limit the present disclosure.
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
FIG. 1 is a flowchart of a gesture recognition method according to an exemplary embodiment.
FIG. 2A is a flowchart of a gesture recognition method according to another exemplary embodiment.
FIG. 2B is a scene diagram of determining the operation position of an operation gesture according to an exemplary embodiment.
FIG. 2C is a scene diagram of determining the operation position of an operation gesture according to another exemplary embodiment.
FIG. 2D is a scene diagram of recognizing an operation gesture according to an exemplary embodiment.
FIG. 2E is a scene diagram of recognizing an operation gesture according to another exemplary embodiment.
FIG. 2F is a scene diagram of recognizing an operation gesture according to another exemplary embodiment.
FIG. 2G is a flowchart of a method for recognizing the speed of an operation gesture according to an exemplary embodiment.
FIG. 2H is a flowchart of a method for recognizing an occlusion gesture according to an exemplary embodiment.
FIG. 3 is a block diagram of a gesture recognition apparatus according to an exemplary embodiment.
FIG. 4 is a block diagram of a gesture recognition apparatus according to an exemplary embodiment.
FIG. 5 is a block diagram of an apparatus for gesture recognition according to an exemplary embodiment.
Detailed Description
Exemplary embodiments will now be described in detail, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
FIG. 1 is a flowchart of a gesture recognition method according to an exemplary embodiment. The method is applied in a terminal that includes a touch screen in which ambient light sensors are distributed. As shown in FIG. 1, the method includes the following steps.
In step 101, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, it is detected whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unoccluded state to an occluded state and then switches from the occluded state back to the unoccluded state.
In step 102, when the at least one ambient light sensor satisfies the preset change rule, the position of the at least one ambient light sensor is determined.
In step 103, the user's operation gesture is recognized according to the position of the at least one ambient light sensor.
In summary, in the gesture recognition method provided by the present disclosure, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, if the light incident on each such ambient light sensor is detected to switch first from the unoccluded state to the occluded state and then from the occluded state back to the unoccluded state, the user's operation gesture is recognized according to the position of the at least one ambient light sensor. The terminal can thus recognize an operation gesture without the user performing a touch operation on the touch screen, which solves the problem that the terminal cannot recognize the user's gestures when touching the screen is inconvenient, adds a way of recognizing gestures, and improves the flexibility of gesture recognition.
FIG. 2A is a flowchart of a gesture recognition method according to another exemplary embodiment. The method is applied in a terminal that includes a touch screen in which ambient light sensors are distributed. As shown in FIG. 2A, the method includes the following steps.
In step 201, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, it is detected whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unoccluded state to an occluded state and then switches from the occluded state back to the unoccluded state.
In the related art, the user's finger must touch the touch screen before the terminal can recognize the user's operation gesture; that is, the operation gesture must act on the touch screen. When the user's fingers are dirty or it is otherwise inconvenient to touch the screen, the user must either wipe the fingers clean before performing the touch operation, so that the terminal cannot be operated promptly, or perform the touch operation directly and contaminate the touch screen. To solve this problem, this embodiment provides a method by which the user's operation gesture can be recognized without a finger directly touching the touch screen, as follows.
The ambient light sensors distributed in the touch screen measure the intensity of the light incident on them. When an object occludes the touch screen, the light intensity values measured by the occluded ambient light sensors decrease; an ambient light sensor can therefore determine from its light intensity value whether it is occluded, and report an occlusion event to the terminal when it is.
After receiving the occlusion event reported by at least one ambient light sensor, the terminal detects whether there is a touch operation on the touch screen. If there is, the occlusion event corresponds to an operation gesture acting on the touch screen; if there is not, the occlusion event corresponds to an operation gesture that does not act on the touch screen.
Suppose the operation gesture is a sliding gesture. For each ambient light sensor occluded during the sliding gesture, the incident light switches from the unoccluded state to the occluded state and then from the occluded state back to the unoccluded state. Whether the gesture made by the user is a sliding gesture can therefore be recognized by detecting whether the at least one ambient light sensor satisfies the preset change rule.
The method of detecting whether the at least one ambient light sensor satisfies the preset change rule includes:
for each ambient light sensor, acquiring the light intensity value measured by the ambient light sensor; detecting whether the light intensity value first decreases and then increases; and, when the light intensity value first decreases and then increases, determining that the light incident on the ambient light sensor first switched from the unoccluded state to the occluded state and then switched from the occluded state back to the unoccluded state, satisfying the preset change rule.
When the light incident on an ambient light sensor is not occluded, the measured light intensity value is relatively large; when the incident light is occluded, the measured value is relatively small. The state of the ambient light sensor can therefore be determined from the change in its light intensity value: when the value changes from large to small, the incident light is determined to have switched from the unoccluded state to the occluded state; when the value then changes from small to large, the incident light is determined to have switched from the occluded state back to the unoccluded state.
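The decrease-then-increase check described above can be sketched as a small function. This is an illustrative sketch, not the patent's implementation: the chronological sample list, the relative drop threshold `drop_ratio`, and the function name are assumptions introduced here.

```python
def meets_preset_change_rule(samples, drop_ratio=0.5):
    """Return True if a sensor's intensity series first falls (the
    sensor becomes occluded) and then rises back (it becomes
    unoccluded again).

    `samples` is a chronological list of intensity readings from one
    ambient light sensor; `drop_ratio` is a hypothetical threshold for
    how far the value must fall relative to the initial reading.
    """
    if len(samples) < 3:
        return False
    baseline = samples[0]
    # Index and value of the lowest reading (the occluded phase).
    low_i = min(range(len(samples)), key=lambda i: samples[i])
    low = samples[low_i]
    # Occluded: the minimum must be well below the unoccluded baseline.
    if low > baseline * drop_ratio:
        return False
    # The value must fall before the minimum and recover after it.
    fell = low < baseline
    recovered = low_i < len(samples) - 1 and samples[-1] > low
    return fell and recovered
```

A monotonically decreasing series (the finger arrives but never leaves) does not satisfy the rule, which matches the requirement that the sensor return to the unoccluded state.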
In step 202, when the at least one ambient light sensor satisfies the preset change rule, the position of the at least one ambient light sensor is determined.
When the user makes an operation gesture at the current moment, the finger blocks the light incident on some ambient light sensors, forming at least one shadow region on the touch screen. In this embodiment, the position of the ambient light sensor at the center of each shadow region may be taken as the position of the ambient light sensors that satisfy the preset change rule at the current moment, that is, as the operation position of the gesture at that moment.
For example, as shown in FIG. 2B, at a certain moment the user makes operation gesture 2, forming shadow region 3 on terminal 1. Terminal 1 acquires the positions of the ambient light sensors corresponding to shadow region 3, calculates the center point 4 of those positions, and takes center point 4 as the operation position of the gesture at that moment.
As another example, as shown in FIG. 2C, at a certain moment the user makes operation gesture 5, forming shadow regions 6 and 7 on terminal 1. Terminal 1 acquires the positions of the ambient light sensors corresponding to shadow region 6 and calculates their center point 8; it acquires the positions of the ambient light sensors corresponding to shadow region 7 and calculates their center point 9; and it takes center points 8 and 9 as the operation positions of the gesture at that moment.
The terminal may identify a shadow region as a region formed by at least one ambient light sensor whose measured light intensity values are equal at the same moment and whose positions are contiguous; this embodiment does not limit the method of identifying shadow regions.
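The center-point computation for one shadow region can be sketched as follows. Treating sensor positions as (x, y) grid coordinates and using the plain centroid are assumptions made here for illustration, not details given in the text.

```python
def shadow_center(blocked_positions):
    """Centroid of the (x, y) positions of the ambient light sensors
    covered by one shadow region, used as the operation position of
    the gesture at that instant (illustrative assumption)."""
    n = len(blocked_positions)
    cx = sum(x for x, _ in blocked_positions) / n
    cy = sum(y for _, y in blocked_positions) / n
    return cx, cy
```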
The terminal determines the operation position of the gesture at each moment from the positions of the at least one ambient light sensor, and can recognize the user's operation gesture from the operation positions at the successive moments.
In step 203, the order in which the at least one ambient light sensor is successively occluded is acquired.
When an ambient light sensor reports an occlusion event, the terminal can record the time at which the event was received. Therefore, when the user's operation gesture is produced by a continuous action, such as a sliding operation, the terminal can acquire the times at which the at least one ambient light sensor was successively occluded, and determine from those times the order in which the sensors were occluded.
In step 204, the positions of the at least one ambient light sensor, taken in that order, are recognized as the operation gesture.
From the operation position determined for each moment in step 202, the operation trajectory of the gesture made without touching the screen is obtained; from the occlusion order determined in step 203, the operation direction of the gesture is obtained. From the trajectory and the direction, the terminal can recognize the operation gesture made by the user.
For example, as shown in FIG. 2D, in terminal 1, operation position 11 is determined at a first moment, operation position 12 at a second moment, and operation position 13 at a third moment; from the determined positions and the order in which they were determined, the user's operation gesture is recognized as a rightward sliding gesture.
Optionally, the terminal may also set a first angle threshold and a second angle threshold to determine the operation direction of the gesture, the second angle threshold being larger than the first. The terminal selects the operation positions determined at any two moments. If the angle between the line connecting the two positions and the horizontal direction is smaller than the first angle threshold, the gesture is recognized as a leftward or rightward sliding operation; if the angle is larger than the first angle threshold but smaller than the second, the gesture is recognized as a diagonal sliding operation; if the angle is larger than the second angle threshold, the gesture is recognized as an upward or downward sliding operation.
For example, as shown in FIG. 2E, the first angle threshold in terminal 1 is set to 30 degrees and the second to 60 degrees. Operation position 12 is determined at an arbitrarily chosen first moment and operation position 14 at an arbitrarily chosen second moment; the angle between the line connecting the two positions and the horizontal direction is 45 degrees, larger than the first threshold and smaller than the second. From the determined positions, the order in which they were determined, and the angle between their connecting line and the horizontal direction, the terminal recognizes the user's operation gesture as a sliding gesture toward the upper right.
Optionally, the terminal may instead calculate, for each pair of operation positions at adjacent moments, the angle between their connecting line and the horizontal direction, and compare the average of these angles with the first and second angle thresholds: if the average is smaller than the first threshold, the gesture is recognized as a leftward or rightward sliding operation; if it lies between the two thresholds, as a diagonal sliding operation; and if it is larger than the second threshold, as an upward or downward sliding operation.
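The two-threshold direction test can be sketched as below, with the 30-degree and 60-degree values taken from the example above; the coordinate convention and the function name are assumptions introduced for illustration.

```python
import math

def classify_swipe(p1, p2, first_deg=30.0, second_deg=60.0):
    """Classify a swipe from the angle between the line p1 -> p2 and
    the horizontal: below the first threshold it is a left/right
    swipe, above the second an up/down swipe, otherwise diagonal."""
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    # Angle against the horizontal, folded into [0, 90] degrees.
    angle = math.degrees(math.atan2(abs(dy), abs(dx)))
    if angle < first_deg:
        return "horizontal"
    if angle > second_deg:
        return "vertical"
    return "diagonal"
```

The sign of `dx`/`dy`, together with the occlusion order from step 203, would then distinguish left from right and up from down.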
Optionally, when the terminal determines at least two operation positions at the same moment, then for each operation position the terminal groups it with the nearest operation position at the adjacent moment into an operation combination, and recognizes the user's operation gesture from the resulting operation combinations.
For example, as shown in FIG. 2F, in terminal 1, operation positions 11 and 15 are determined at a first moment, operation positions 12 and 16 at a second moment, and operation positions 13 and 17 at a third moment. The terminal determines that operation positions 11, 12, and 13 form one operation combination and that operation positions 15, 16, and 17 form another, and recognizes the user's operation gesture from the two combinations.
Optionally, to provide more ways of responding to operation gestures, the terminal may also recognize the speed of the operation gesture. FIG. 2G is a flowchart of a method for recognizing the speed of an operation gesture according to an exemplary embodiment. The method is applied in a terminal that includes a touch screen in which ambient light sensors are distributed. As shown in FIG. 2G, the method includes the following steps.
In step 205, the duration of the change in each ambient light sensor's light intensity value is acquired from the at least one ambient light sensor.
For each ambient light sensor, in one possible implementation the terminal records the times at which the sensor's light intensity value starts and stops changing and, when the sensor satisfies the preset change rule, calculates the duration of the change from the recorded times. In another possible implementation, the terminal acquires the duration of the change in the sensor's light intensity value before detecting whether the sensor satisfies the preset change rule, and simply reads the previously acquired duration once the rule is detected to be satisfied. This embodiment does not limit when the terminal acquires the change duration.
In step 206, the average of the change durations is calculated.
The terminal calculates the average of all the acquired change durations, or selects at least one of the acquired change durations and calculates the average of the selected ones.
In step 207, the operating speed of the gesture is determined according to the average.
One or more time thresholds are preset in the terminal, and the operating speed of the gesture is determined by comparing the average with the time threshold(s).
For example, two time thresholds are set in the terminal, a first time threshold and a second time threshold, the second being smaller than the first. If the average is larger than the first time threshold, the terminal determines that the gesture is a slow gesture; if the average is smaller than the second time threshold, the terminal determines that the gesture is a fast gesture; if the average is larger than the second time threshold and smaller than the first, the terminal determines that the gesture is a medium-speed gesture.
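The comparison in step 207 can be sketched as below. The millisecond threshold values are placeholders introduced here, not values given in the text.

```python
def gesture_speed(change_durations_ms, first_ms=400.0, second_ms=150.0):
    """Classify the gesture speed from the mean change duration of the
    triggered sensors; `first_ms`/`second_ms` stand in for the first
    and second time thresholds (second < first) and are assumed values."""
    avg = sum(change_durations_ms) / len(change_durations_ms)
    if avg > first_ms:
        return "slow"
    if avg < second_ms:
        return "fast"
    return "medium"
```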
In step 208, the manner of responding to the gesture is determined according to the operating speed.
Recognizing the operating speed of the gesture increases the ways in which the terminal can respond to it.
For example, the response corresponding to a fast rightward slide is fast-forwarding the video, while the response corresponding to a slow rightward slide is jumping to the next video.
Optionally, this embodiment also provides a method for recognizing an occlusion gesture made by the user. FIG. 2H is a flowchart of such a method, which includes the following steps.
In step 209, the minimum value of each ambient light sensor's changing light intensity is acquired from the at least one ambient light sensor.
If the terminal determines that at least one ambient light sensor satisfies the preset change rule, the terminal acquires the minimum of each sensor's light intensity values during the change process; this minimum is the value measured while the incident light is blocked. Typically, the minimum values measured by the respective ambient light sensors are the same.
In step 210, it is detected whether the respective minimum values remain unchanged over the same time period.
If the terminal detects that at least one minimum light intensity value exists at the same moment, it continues to detect whether each such minimum value remains unchanged over the same time period.
In step 211, the user's occlusion gesture is recognized when the respective minimum values remain unchanged over the same time period.
The terminal may recognize the user's operation gesture as an occlusion gesture when each minimum value remains unchanged over the same period. By recognizing the user's occlusion gesture, the terminal extends its ways of responding to operation gestures; for example, when the terminal recognizes that the user has made an occlusion gesture, the corresponding response may be to pause a playing video or piece of music, or to click a certain application.
Optionally, a first predetermined time may be set in the terminal; when the duration for which each minimum value remains unchanged is greater than or equal to the first predetermined time, the user's operation gesture is identified as a first type of occlusion gesture. Alternatively, a second predetermined time may be set in the terminal; when that duration is less than or equal to the second predetermined time, the user's operation gesture is identified as a second type of occlusion gesture. The terminal sets different response modes for different types of occlusion gestures.
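The duration-based classification of occlusion gestures can be sketched as below; the two millisecond values are hypothetical placeholders standing in for the first and second predetermined times described above, and the gesture labels are invented for illustration.

```python
def classify_cover_gesture(hold_ms, first_ms=1000.0, second_ms=300.0):
    """Classify a covering (occlusion) gesture by how long the minimum
    intensity values stayed constant; threshold values are assumed."""
    if hold_ms >= first_ms:
        return "long-cover"   # e.g. mapped to pausing video or music
    if hold_ms <= second_ms:
        return "short-cover"  # e.g. mapped to clicking an application
    return "cover"
```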
In summary, in the gesture recognition method provided by the present disclosure, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, if the light incident on each such ambient light sensor is detected to switch first from the unoccluded state to the occluded state and then from the occluded state back to the unoccluded state, the user's operation gesture is recognized according to the position of the at least one ambient light sensor. The terminal can thus recognize an operation gesture without the user performing a touch operation on the touch screen, which solves the problem that the terminal cannot recognize the user's gestures when touching the screen is inconvenient, adds a way of recognizing gestures, and improves the flexibility of gesture recognition.
Moreover, by recognizing the speed of the operation gesture as well as the occlusion gesture, the set of operation gestures can be extended, which solves the problem that the user can make only a few gestures without touching the screen, so that the touch screen has only a few ways of responding, and increases the ways in which the touch screen can respond to operation gestures.
FIG. 3 is a block diagram of a gesture recognition apparatus according to an exemplary embodiment. The apparatus is applied in a terminal that includes a touch screen in which ambient light sensors are distributed. As shown in FIG. 3, the gesture recognition apparatus includes a first detecting module 310, a first determining module 320, and a first identifying module 330.
The first detecting module 310 is configured to detect, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unoccluded state to an occluded state and then switches from the occluded state back to the unoccluded state.
The first determining module 320 is configured to determine the position of the at least one ambient light sensor when the result detected by the first detecting module 310 is that the at least one ambient light sensor satisfies the preset change rule.
The first identifying module 330 is configured to recognize the user's operation gesture according to the position of the at least one ambient light sensor determined by the first determining module 320.
In summary, in the gesture recognition apparatus provided by the present disclosure, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, if the light incident on each such ambient light sensor is detected to switch first from the unoccluded state to the occluded state and then from the occluded state back to the unoccluded state, the user's operation gesture is recognized according to the position of the at least one ambient light sensor. The terminal can thus recognize an operation gesture without the user performing a touch operation on the touch screen, which solves the problem that the terminal cannot recognize the user's gestures when touching the screen is inconvenient, adds a way of recognizing gestures, and improves the flexibility of gesture recognition.
FIG. 4 is a block diagram of a gesture recognition apparatus according to an exemplary embodiment. The apparatus is applied in a terminal that includes a touch screen in which ambient light sensors are distributed. As shown in FIG. 4, the gesture recognition apparatus includes a first detecting module 410, a first determining module 420, and a first identifying module 430.
The first detecting module 410 is configured to detect, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unoccluded state to an occluded state and then switches from the occluded state back to the unoccluded state.
The first determining module 420 is configured to determine the position of the at least one ambient light sensor when the result detected by the first detecting module 410 is that the at least one ambient light sensor satisfies the preset change rule.
The first identifying module 430 is configured to recognize the user's operation gesture according to the position of the at least one ambient light sensor determined by the first determining module 420.
Optionally, the first detecting module 410 includes a first acquiring submodule 411, a detecting submodule 412, and a determining submodule 413.
The first acquiring submodule 411 is configured to acquire, for each ambient light sensor, the light intensity value measured by the ambient light sensor.
The detecting submodule 412 is configured to detect whether the light intensity value acquired by the first acquiring submodule 411 first decreases and then increases.
The determining submodule 413 is configured to determine, when the result detected by the detecting submodule 412 is that the light intensity value first decreases and then increases, that the light incident on the ambient light sensor first switched from the unoccluded state to the occluded state and then switched from the occluded state back to the unoccluded state, satisfying the preset change rule.
Optionally, the first identifying module 430 includes a second acquiring submodule 431 and an identifying submodule 432.
The second acquiring submodule 431 is configured to acquire the order in which the at least one ambient light sensor is successively occluded.
The identifying submodule 432 is configured to recognize the positions of the at least one ambient light sensor, taken in the order acquired by the second acquiring submodule 431, as the operation gesture.
Optionally, the apparatus further includes a first acquiring module 440, a calculating module 450, a second determining module 460, and a third determining module 470.
The first acquiring module 440 is configured to acquire, from the at least one ambient light sensor, the duration of the change in each ambient light sensor's light intensity value.
The calculating module 450 is configured to calculate the average of the change durations acquired by the first acquiring module 440.
The second determining module 460 is configured to determine the operating speed of the gesture according to the average calculated by the calculating module 450.
The third determining module 470 is configured to determine the manner of responding to the gesture according to the operating speed determined by the second determining module 460.
Optionally, the apparatus further includes a second acquiring module 480, a second detecting module 490, and a second identifying module 491.
The second acquiring module 480 is configured to acquire, from the at least one ambient light sensor, the minimum value of each ambient light sensor's changing light intensity.
The second detecting module 490 is configured to detect whether the respective minimum values acquired by the second acquiring module 480 remain unchanged over the same time period.
The second identifying module 491 is configured to recognize the user's occlusion gesture when the result detected by the second detecting module 490 is that the respective minimum values remain unchanged over the same time period.
In summary, in the gesture recognition apparatus provided by the present disclosure, when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, if the light incident on each such ambient light sensor is detected to switch first from the unoccluded state to the occluded state and then from the occluded state back to the unoccluded state, the user's operation gesture is recognized according to the position of the at least one ambient light sensor. The terminal can thus recognize an operation gesture without the user performing a touch operation on the touch screen, which solves the problem that the terminal cannot recognize the user's gestures when touching the screen is inconvenient, adds a way of recognizing gestures, and improves the flexibility of gesture recognition.
Moreover, by recognizing the speed of the operation gesture as well as the occlusion gesture, the set of operation gestures can be extended, which solves the problem that the user can make only a few gestures without touching the screen, so that the touch screen has only a few ways of responding, and increases the ways in which the touch screen can respond to operation gestures.
With regard to the apparatuses in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments concerning the method, and will not be elaborated here.
An exemplary embodiment of the present disclosure provides a gesture recognition apparatus capable of implementing the gesture recognition method provided by the present disclosure. The apparatus is applied in a terminal that includes a touch screen in which ambient light sensors are distributed, and includes a processor and a memory for storing processor-executable instructions;
wherein the processor is configured to:
when light incident on at least one ambient light sensor is blocked and no touch operation on the touch screen is detected, detect whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unoccluded state to an occluded state and then switches from the occluded state back to the unoccluded state;
when the at least one ambient light sensor satisfies the preset change rule, determine the position of the at least one ambient light sensor; and
recognize the user's operation gesture according to the position of the at least one ambient light sensor.
FIG. 5 is a block diagram of an apparatus 500 for gesture recognition, according to an exemplary embodiment. For example, the apparatus 500 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.
Referring to FIG. 5, the apparatus 500 may include one or more of the following components: a processing component 502, a memory 504, a power component 506, a multimedia component 508, an audio component 510, an input/output (I/O) interface 512, a sensor component 514, and a communication component 516.
The processing component 502 typically controls the overall operation of the apparatus 500, such as operations associated with display, telephone calls, data communications, camera operation, and recording. The processing component 502 may include one or more processors 518 to execute instructions so as to perform all or part of the steps of the methods described above. In addition, the processing component 502 may include one or more modules that facilitate interaction between the processing component 502 and the other components. For example, the processing component 502 may include a multimedia module to facilitate interaction between the multimedia component 508 and the processing component 502.
The memory 504 is configured to store various types of data to support operation of the apparatus 500. Examples of such data include instructions for any application or method operated on the apparatus 500, contact data, phone book data, messages, pictures, videos, and so on. The memory 504 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk.
The power component 506 provides power to the various components of the apparatus 500. It may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the apparatus 500.
The multimedia component 508 includes a screen that provides an output interface between the apparatus 500 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or slide action but also the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 508 includes a front camera and/or a rear camera. When the apparatus 500 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front or rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 510 is configured to output and/or input audio signals. For example, the audio component 510 includes a microphone (MIC) that is configured to receive external audio signals when the apparatus 500 is in an operating mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signal may be further stored in the memory 504 or transmitted via the communication component 516. In some embodiments, the audio component 510 also includes a speaker for outputting audio signals.
The I/O interface 512 provides an interface between the processing component 502 and peripheral interface modules, such as a keyboard, a click wheel, or buttons. These buttons may include, but are not limited to, a home button, volume buttons, a start button, and a lock button.
The sensor component 514 includes one or more sensors for providing status assessments of various aspects of the apparatus 500. For example, the sensor component 514 can detect the open/closed state of the apparatus 500 and the relative positioning of components (for example, the display and keypad of the apparatus 500); the sensor component 514 can also detect a change in position of the apparatus 500 or of one of its components, the presence or absence of user contact with the apparatus 500, the orientation or acceleration/deceleration of the apparatus 500, and changes in its temperature. The sensor component 514 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 514 may also include an ambient light sensor for detecting the intensity of the ambient light around the apparatus 500. In some embodiments, the sensor component 514 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 516 is configured to facilitate wired or wireless communication between the apparatus 500 and other devices. The apparatus 500 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 516 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 516 also includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 500 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the methods described above.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 504 including instructions executable by the processor 518 of the apparatus 500 to perform the methods described above. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Other embodiments of the present disclosure will readily occur to those skilled in the art upon consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the present disclosure that follow its general principles and include common knowledge or customary technical means in the art not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (11)

  1. A gesture recognition method, characterized in that it is used in a terminal comprising a touch screen in which ambient light sensors are distributed, the method comprising:
    when light incident on at least one ambient light sensor is blocked and no touch operation acting on the touch screen is detected, detecting whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unoccluded state to an occluded state and then switches from the occluded state back to the unoccluded state;
    when the at least one ambient light sensor satisfies the preset change rule, determining the position of the at least one ambient light sensor; and
    recognizing a user's operation gesture according to the position of the at least one ambient light sensor.
  2. The method according to claim 1, characterized in that detecting whether the at least one ambient light sensor satisfies the preset change rule comprises:
    for each ambient light sensor, acquiring the light intensity value measured by the ambient light sensor;
    detecting whether the light intensity value first decreases and then increases; and
    when the light intensity value first decreases and then increases, determining that the light incident on the ambient light sensor first switched from the unoccluded state to the occluded state and then switched from the occluded state back to the unoccluded state, satisfying the preset change rule.
  3. The method according to claim 1, characterized in that recognizing the user's operation gesture according to the position of the at least one ambient light sensor comprises:
    acquiring the order in which the at least one ambient light sensor is successively occluded; and
    recognizing the positions of the at least one ambient light sensor, taken in the order, as the operation gesture.
  4. The method according to any one of claims 1 to 3, characterized in that the method further comprises:
    acquiring, from the at least one ambient light sensor, the duration of the change in each ambient light sensor's light intensity value;
    calculating the average of the change durations;
    determining the operating speed of the gesture according to the average; and
    determining the manner of responding to the gesture according to the operating speed.
  5. The method according to claim 1, characterized in that the method further comprises:
    acquiring, from the at least one ambient light sensor, the minimum value of each ambient light sensor's changing light intensity;
    detecting whether the respective minimum values remain unchanged over the same time period; and
    recognizing the user's occlusion gesture when the respective minimum values remain unchanged over the same time period.
  6. A gesture recognition apparatus, characterized in that it is used in a terminal comprising a touch screen in which ambient light sensors are distributed, the apparatus comprising:
    a first detecting module configured to detect, when light incident on at least one ambient light sensor is blocked and no touch operation acting on the touch screen is detected, whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unoccluded state to an occluded state and then switches from the occluded state back to the unoccluded state;
    a first determining module configured to determine the position of the at least one ambient light sensor when the result detected by the first detecting module is that the at least one ambient light sensor satisfies the preset change rule; and
    a first identifying module configured to recognize a user's operation gesture according to the position of the at least one ambient light sensor determined by the first determining module.
  7. The apparatus according to claim 6, characterized in that the first detecting module comprises:
    a first acquiring submodule configured to acquire, for each ambient light sensor, the light intensity value measured by the ambient light sensor;
    a detecting submodule configured to detect whether the light intensity value acquired by the first acquiring submodule first decreases and then increases; and
    a determining submodule configured to determine, when the result detected by the detecting submodule is that the light intensity value first decreases and then increases, that the light incident on the ambient light sensor first switched from the unoccluded state to the occluded state and then switched from the occluded state back to the unoccluded state, satisfying the preset change rule.
  8. The apparatus according to claim 6, characterized in that the first identifying module comprises:
    a second acquiring submodule configured to acquire the order in which the at least one ambient light sensor is successively occluded; and
    an identifying submodule configured to recognize the positions of the at least one ambient light sensor, taken in the order acquired by the second acquiring submodule, as the operation gesture.
  9. The apparatus according to any one of claims 6 to 8, characterized in that the apparatus further comprises:
    a first acquiring module configured to acquire, from the at least one ambient light sensor, the duration of the change in each ambient light sensor's light intensity value;
    a calculating module configured to calculate the average of the change durations acquired by the first acquiring module;
    a second determining module configured to determine the operating speed of the gesture according to the average calculated by the calculating module; and
    a third determining module configured to determine the manner of responding to the gesture according to the operating speed determined by the second determining module.
  10. The device according to claim 6, further comprising:
    a second acquisition module configured to acquire, for each of the at least one ambient light sensor, a minimum value of its light intensity change;
    a second detection module configured to detect whether the minimum values acquired by the second acquisition module remain unchanged within a same time period; and
    a second recognition module configured to recognize a covering gesture of the user when the second detection module detects that the minimum values remain unchanged within the same time period.
  11. A gesture recognition device, for use in a terminal comprising a touch screen in which ambient light sensors are distributed, the device comprising:
    a processor; and
    a memory for storing processor-executable instructions;
    wherein the processor is configured to:
    when light incident on at least one ambient light sensor is blocked and no touch operation acting on the touch screen is detected, detect whether the at least one ambient light sensor satisfies a preset change rule, the preset change rule being that the light incident on the ambient light sensor first switches from an unblocked state to a blocked state, and then switches from the blocked state back to an unblocked state;
    when the at least one ambient light sensor satisfies the preset change rule, determine a position of the at least one ambient light sensor; and
    recognize an operation gesture of a user according to the position of the at least one ambient light sensor.
PCT/CN2016/100994 2016-01-19 2016-09-30 Gesture recognition method and device WO2017124773A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
RU2017140024A RU2690202C2 (ru) 2016-01-19 2016-09-30 Method and device for gesture recognition
JP2016569698A JP6533535B2 (ja) 2016-01-19 2016-09-30 Gesture identification method, device, program and recording medium
KR1020177031107A KR102045232B1 (ko) 2016-01-19 2016-09-30 Gesture identification method, device, program and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610035203.3A CN105511631B (zh) 2016-01-19 2016-01-19 Gesture recognition method and device
CN201610035203.3 2016-01-19

Publications (1)

Publication Number Publication Date
WO2017124773A1 true WO2017124773A1 (zh) 2017-07-27

Family

ID=55719681

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/100994 WO2017124773A1 (zh) 2016-01-19 2016-09-30 Gesture recognition method and device

Country Status (7)

Country Link
US (1) US20170205962A1 (zh)
EP (1) EP3196736A1 (zh)
JP (1) JP6533535B2 (zh)
KR (1) KR102045232B1 (zh)
CN (1) CN105511631B (zh)
RU (1) RU2690202C2 (zh)
WO (1) WO2017124773A1 (zh)


Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105511631B (zh) * 2016-01-19 2018-08-07 北京小米移动软件有限公司 Gesture recognition method and device
WO2017147869A1 * 2016-03-03 2017-09-08 邱琦 Light-sensing gesture recognition method
CN106095292A (zh) * 2016-06-03 2016-11-09 上海与德通讯技术有限公司 Terminal device and operation method thereof
JP6838434B2 (ja) 2017-03-13 2021-03-03 オムロン株式会社 Environment sensor
CN109597405A (zh) * 2017-09-30 2019-04-09 阿里巴巴集团控股有限公司 Method for controlling movement of a robot, and robot
GB2572978B (en) * 2018-04-18 2022-01-26 Ge Aviat Systems Ltd Method and apparatus for a display module control system
KR20200055202A (ko) * 2018-11-12 2020-05-21 삼성전자주식회사 Electronic device providing a speech recognition service triggered by a gesture, and operating method thereof
CN109558035A (zh) * 2018-11-27 2019-04-02 英华达(上海)科技有限公司 Light-sensor-based input method, terminal device and storage medium
CN110046585A (zh) * 2019-04-19 2019-07-23 西北工业大学 Gesture recognition method based on ambient light
CN110764611A (zh) * 2019-09-30 2020-02-07 深圳宝龙达信创科技股份有限公司 Gesture recognition module and notebook computer
CN112710388B (zh) * 2019-10-24 2022-07-01 北京小米移动软件有限公司 Ambient light detection method, ambient light detection device, terminal device and storage medium
CN111623392A (zh) * 2020-04-13 2020-09-04 华帝股份有限公司 Range hood with a gesture recognition component, and control method thereof
CN111596759A (zh) * 2020-04-29 2020-08-28 维沃移动通信有限公司 Operation gesture recognition method, device, equipment and medium
CN112019978B (zh) * 2020-08-06 2022-04-26 安徽华米信息科技有限公司 Scene switching method and device for true wireless stereo (TWS) earphones, and earphone
CN112433611A (zh) * 2020-11-24 2021-03-02 珠海格力电器股份有限公司 Control method and device for a terminal device
CN112947753B (zh) * 2021-02-19 2023-08-08 歌尔科技有限公司 Wearable device, control method thereof, and readable storage medium
CN114020382A (zh) * 2021-10-29 2022-02-08 杭州逗酷软件科技有限公司 Execution method, electronic device and computer storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012065516A1 * 2010-11-15 2012-05-24 惠州Tcl移动通信有限公司 Method for implementing a camera shutter function through gesture recognition, and mobile phone device
CN103713735A * 2012-09-29 2014-04-09 华为技术有限公司 Method and device for controlling a terminal device using contactless gestures
CN104484032A * 2013-07-01 2015-04-01 黑莓有限公司 Gesture detection using an ambient light sensor
CN105511631A * 2016-01-19 2016-04-20 北京小米移动软件有限公司 Gesture recognition method and device

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4721159B2 (ja) * 2005-03-28 2011-07-13 ミネベア株式会社 Planar lighting device
CN100504779C (zh) * 2006-06-30 2009-06-24 联想(北京)有限公司 Method for accelerating BIOS operation
JP2010108081A (ja) * 2008-10-28 2010-05-13 Sharp Corp Menu display device, control method for menu display device, and menu display program
US20110242440A1 (en) * 2009-01-20 2011-10-06 Mikihiro Noma Liquid crystal display device provided with light intensity sensor
TWI590130B (zh) * 2010-07-09 2017-07-01 群邁通訊股份有限公司 Portable electronic device and unlocking/page-turning method thereof
CN104137027B (zh) * 2011-10-10 2018-04-17 因维萨热技术公司 Event capture in space and time
KR101880998B1 (ko) * 2011-10-14 2018-07-24 삼성전자주식회사 Apparatus and method for motion recognition using an event-based vision sensor
WO2013114507A1 (ja) * 2012-02-03 2013-08-08 日本電気株式会社 Electromagnetic wave transmission sheet and electromagnetic wave transmission device
US20150000247A1 (en) * 2013-07-01 2015-01-01 General Electric Company System and method for detecting airfoil clash within a compressor
EP2821889B1 (en) * 2013-07-01 2018-09-05 BlackBerry Limited Performance control of ambient light sensors
KR102104442B1 (ko) * 2013-07-22 2020-04-24 엘지전자 주식회사 Laundry treatment apparatus
US9465448B2 (en) * 2013-07-24 2016-10-11 Blackberry Limited Backlight for touchless gesture detection


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11045736B2 (en) 2018-05-02 2021-06-29 Nintendo Co., Ltd. Storage medium storing information processing program, information processing apparatus, information processing system, and information processing method
US11103784B2 (en) 2018-05-02 2021-08-31 Nintendo Co., Ltd. Storage medium storing information processing program, information processing apparatus, information processing system, and information processing method
US11484783B2 (en) 2018-05-02 2022-11-01 Nintendo Co., Ltd. Storage medium storing information processing program, information processing apparatus, information processing system, and information processing method
US11673043B2 (en) 2018-05-02 2023-06-13 Nintendo Co., Ltd. Storage medium storing information processing program, information processing apparatus, information processing system, and information processing method

Also Published As

Publication number Publication date
RU2017140024A (ru) 2019-05-17
RU2017140024A3 (zh) 2019-05-17
CN105511631B (zh) 2018-08-07
RU2690202C2 (ru) 2019-05-31
CN105511631A (zh) 2016-04-20
EP3196736A1 (en) 2017-07-26
KR102045232B1 (ko) 2019-11-15
US20170205962A1 (en) 2017-07-20
JP2018506086A (ja) 2018-03-01
KR20170132264A (ko) 2017-12-01
JP6533535B2 (ja) 2019-06-19

Similar Documents

Publication Publication Date Title
WO2017124773A1 (zh) Gesture recognition method and device
WO2017071068A1 (zh) Application switching method, device and equipment
JP6199510B2 (ja) Method and device for switching display modes
WO2017088579A1 (zh) Fingerprint recognition method and device
JP6559769B2 (ja) Gesture identification method and device
US20170344192A1 (en) Method and device for playing live videos
WO2017059638A1 (zh) Fingerprint recognition method and device
US20170060320A1 (en) Method for controlling a mobile terminal using a side touch panel
WO2017177597A1 (zh) Physical key assembly, terminal, and touch response method and device
CN111695382B (zh) Fingerprint collection region determination method and fingerprint collection region determination device
WO2017071050A1 (zh) Accidental-touch prevention method and device for a terminal with a touch screen
CN108255369B (zh) Display method and device for an in-screen fingerprint icon, and computer-readable storage medium
RU2683979C2 (ru) Method and device for pressure detection
EP3232301B1 (en) Mobile terminal and virtual key processing method
US10061497B2 (en) Method, device and storage medium for interchanging icon positions
WO2017201891A1 (zh) Method and device for controlling a touch screen state, and electronic device
WO2016206295A1 (zh) Character determination method and device
US20190370584A1 (en) Collecting fingerprints
WO2016095395A1 (zh) Method and device for activating an operating state of a mobile terminal
US20170336962A1 (en) Gesture response method and device
US20160195992A1 (en) Mobile terminal and method for processing signals generated from touching virtual keys
CN107329604B (zh) Mobile terminal control method and device
US20160173668A1 (en) Method and device for activating an operating state of a mobile terminal
CN111736718A (zh) Touch screen control method and device
WO2023230829A1 (zh) Touch detection method, device and storage medium

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2016569698

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16886033

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20177031107

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2017140024

Country of ref document: RU

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16886033

Country of ref document: EP

Kind code of ref document: A1