WO2018068403A1 - Gesture recognition method and apparatus based on a touch remote controller - Google Patents

Gesture recognition method and apparatus based on a touch remote controller

Info

Publication number
WO2018068403A1
Authority
WO
WIPO (PCT)
Prior art keywords
point
touch
distance
gesture recognition
smoothed
Prior art date
Application number
PCT/CN2016/112399
Other languages
English (en)
Chinese (zh)
Inventor
杨斌
Original Assignee
深圳Tcl新技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳Tcl新技术有限公司 filed Critical 深圳Tcl新技术有限公司
Publication of WO2018068403A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42201Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] biosensors, e.g. heat sensor for presence detection, EEG sensors or any limb activity sensors worn by the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42224Touch pad or touch panel provided on the remote control

Definitions

  • the present invention relates to the field of remote control technologies, and in particular, to a gesture recognition method and apparatus based on a touch remote controller.
  • The existing touch-type remote controller typically performs its software algorithm processing only when the user's finger lifts and the sliding motion ends, and then sends a control signal indicating the recognized gesture motion to the smart TV.
  • Since the transmission time of the touch remote controller is generally 20 milliseconds and the finger operation time varies from 180 milliseconds to 800 milliseconds, it takes at least 200 milliseconds from the moment the user slides on the touchpad to the moment the smart TV interface responds to the operation command; if the finger moves slowly, the response time is even longer. The user therefore perceives the smart TV's response to the remote controller's touch action as slow, which seriously affects the user experience.
  • The main purpose of the present invention is to provide a gesture recognition method and device based on a touch remote controller, aiming to solve the problem that the existing touch remote controller must wait for the user's finger to lift and the sliding motion to end before performing gesture recognition, which makes the smart television respond slowly to the touch action of the remote controller.
  • the present invention provides a gesture recognition method based on a touch remote controller, which includes the following steps:
  • Step A During the touch operation, the touch motion track is smoothed according to the point information on the touch motion track collected in real time, and the touch distance is calculated according to the smoothed touch motion track to determine whether the touch distance reaches the threshold distance;
  • Step B When the touch distance reaches a threshold distance, perform gesture recognition according to the smoothed touch motion track, and output a gesture recognition result.
  • the smoothing process includes linear smoothing, function fitting smoothing, and exponential smoothing.
  • the step A comprises:
  • Step A1 When the point information of the second point is collected, determine a point on the line connecting the first point and the second point as a smoothed point according to a preset rule, and use the smoothed point as the starting point of the smoothed track;
  • Step A2 Determine a point on the line connecting each subsequently collected point and the adjacent previous point as a smoothed point according to the preset rule, and use the smoothed point as an intermediate point of the smoothed track;
  • Step A3 Calculate the distance from the intermediate point to the starting point, and use the distance as the touch distance;
  • Step A4 Determine whether the touch distance reaches the threshold distance. When the touch distance does not reach the threshold distance, return to step A2; when the touch distance reaches the threshold distance, proceed to step B.
  • the method further includes:
  • Step A5 When the distance between the intermediate point and the starting point is less than the threshold distance, predict the next point on the smoothed track according to the intermediate point and the adjacent previous smoothed point;
  • Step A6 Calculate the distance from the predicted point to the starting point, and use the distance as the touch distance;
  • Step A7 Determine whether the touch distance reaches the threshold distance. When the touch distance does not reach the threshold distance, return to step A2; when the touch distance reaches the threshold distance, proceed to step B.
  • the step B comprises:
  • Step B1 calculating an angle formed by the line connecting the starting point to the predicted point and the reference line, and using the angle as a linear motion angle;
  • Step B2 when the linear motion angle is within the first angular range, determining that the touch motion is an upward shift, and when the linear motion angle is within a second angular range, determining that the touch motion is a left shift And determining that the touch motion is a downward shift when the linear motion angle is within a third angular range, and determining that the touch motion is a right shift when the linear motion angle is within a fourth angular range.
  • the method further includes:
  • A circle is drawn with the starting point on the smoothed track as the center and the threshold distance as the radius; when the intermediate point or the predicted point falls outside the circle, the touch distance has reached the threshold distance.
  • the method further includes:
  • According to the position where the connecting line exits the circle, the direction of the touch motion is identified as a left shift, right shift, up shift or down shift, and the gesture recognition result is output.
  • the method further includes:
  • Step C When the touch distance reaches the threshold distance, count the number of points from the starting point to the predicted point as N, and take the (N+1)th point collected on the touch motion track as the first point of the next segment of the touch motion track.
  • the present invention further provides a gesture recognition device based on a touch remote controller, including:
  • the calculation module is configured to smooth the touch motion track according to the point information on the touch motion track collected in real time during the touch operation, and calculate the touch distance according to the smoothed touch motion track to determine whether the touch distance reaches the threshold distance;
  • the identification module is configured to perform gesture recognition according to the smoothed touch motion track when the touch distance reaches a threshold distance, and output a gesture recognition result.
  • the smoothing process includes linear smoothing, function fitting smoothing, and exponential smoothing.
  • the calculation module comprises:
  • a smoothing unit configured to determine, when the point information of the second point is collected, a point on the line connecting the first point and the second point as a smoothed point according to a preset rule, and to use the smoothed point as the starting point of the smoothed track;
  • a calculating unit configured to calculate a distance from the intermediate point to the starting point, and use the distance as the touch distance
  • the comparing unit is configured to determine whether the touch distance reaches a threshold distance.
  • the calculation module further includes a prediction unit
  • the prediction unit is configured to: when the distance between the intermediate point and the starting point is less than the threshold distance, predict the next point on the smoothed track according to the intermediate point and the adjacent previous smoothed point;
  • the calculating unit is further configured to calculate a distance from the predicted point to the starting point, and use the distance as the touch distance;
  • the comparing unit is further configured to determine whether the touch distance reaches a threshold distance.
  • the identification module comprises:
  • An angle unit configured to calculate an angle formed by a line connecting the starting point to the predicted point and a reference line, and using the angle as a linear motion angle
  • a determining unit configured to: when the linear motion angle is within a first angular range, determine that the touch motion is an upward shift, and when the linear motion angle is within a second angular range, determine that the touch motion is Moving to the left, when the linear motion angle is within the third angular range, determining that the touch motion is a downward shift, and when the linear motion angle is within the fourth angular range, determining that the touch motion is a right shift .
  • the identification module is further configured to draw a circle with the starting point on the smoothed track as the center and the threshold distance as the radius.
  • the identification module is further configured to identify, according to the position at which the line connecting the first point and the center of the circle extends beyond the circle, the direction of the touch motion as a left shift, right shift, up shift or down shift, and to output the gesture recognition result.
  • the touch remote controller-based gesture recognition device further includes:
  • a starting point module configured to, when the touch distance reaches the threshold distance, count the number of points from the starting point to the predicted point as N, and take the (N+1)th point collected on the touch motion track as the first point of the next segment of the touch motion track.
  • the touch motion track is smoothed according to the point information on the touch motion track collected in real time, and the touch distance is calculated according to the smoothed touch motion track to determine whether the touch distance reaches a threshold distance; when the touch distance reaches the threshold distance, gesture recognition is performed according to the smoothed touch motion track, and the gesture recognition result is output.
  • Gesture recognition is performed during the touch motion, so that the smart TV responds to the touch action of the touch remote controller synchronously while the user is sliding on the touch panel, and the response is rapid.
  • FIG. 1 is a schematic flow chart of a first embodiment of a gesture recognition method based on a touch remote controller according to the present invention
  • FIG. 2 is a schematic flow chart of the refinement of the first embodiment of step S10 of FIG. 1;
  • FIG. 3 is a schematic diagram of a smoothing process of a touch motion track according to an embodiment of the present invention
  • FIG. 4 is a schematic flow chart of the refinement of the second embodiment of step S10 of FIG. 1;
  • FIG. 5 is a schematic diagram of determining whether a touch distance reaches a threshold distance according to an embodiment of the present invention
  • FIG. 6 is a schematic diagram showing the refinement process of an embodiment of step S20 in FIG. 1;
  • FIG. 7 is a schematic flow chart of a second embodiment of a gesture recognition method based on a touch remote controller according to the present invention.
  • FIG. 8 is a schematic diagram of functional modules of a first embodiment of a gesture recognition device based on a touch remote controller according to the present invention.
  • FIG. 9 is a schematic diagram of a refinement function module of the first embodiment of the computing module of FIG. 8;
  • FIG. 10 is a schematic diagram of a refinement function module of the second embodiment of the computing module of FIG. 8;
  • FIG. 11 is a schematic diagram of a refinement function module of an embodiment of the identification module of FIG. 8;
  • FIG. 12 is a schematic diagram of functional modules of a second embodiment of a gesture recognition device based on a touch remote controller of the present invention.
  • the main solution of the embodiment of the present invention is: during the touch operation, the touch motion track is smoothed according to the point information on the touch motion track collected in real time, and the touch distance is calculated according to the smoothed touch motion track to determine whether the touch distance reaches a threshold distance; when the touch distance reaches the threshold distance, gesture recognition is performed according to the smoothed touch motion track, and the gesture recognition result is output.
  • Gesture recognition is performed during the touch motion, so that the smart TV responds to the touch action of the touch remote controller synchronously while the user is sliding on the touch panel, and the response is rapid.
  • This solves the problem that, in the existing approach, gesture motion recognition is performed only after the sliding motion ends, so that the smart television responds slowly to the touch action of the touch remote controller.
  • the present invention provides a gesture recognition method based on a touch remote controller.
  • FIG. 1 is a schematic flow chart of a first embodiment of a gesture recognition method based on a touch remote controller according to the present invention.
  • the touch remote controller-based gesture recognition method includes:
  • Step S10 During the touch operation, the touch motion track is smoothed according to the point information on the touch motion track collected in real time, and the touch distance is calculated according to the smoothed touch motion track to determine whether the touch distance reaches the threshold distance;
  • The touch remote controller includes a touch panel; the touch panel may also be a touch screen, and may be any panel capable of detecting touch points, such as a resistive touch panel or a capacitive-sensing touch panel.
  • The touch remote controller detects the touched points and, from the sequence of detected points, acquires the touch motion track produced by the sliding of the user's finger. Since the detected points may contain interference or jumps, the raw touch motion trajectory can be very rough, which greatly reduces the speed and accuracy of gesture recognition. An embodiment of the present invention therefore smooths the touch motion trajectory to improve the speed and accuracy of gesture recognition.
  • FIG. 2 is a schematic diagram of a refinement process of the first embodiment of step S10 of FIG. 1.
  • Step S11 When the point information of the second point is collected, determine a point on the line connecting the first point and the second point as a smoothed point according to a preset rule, and use the smoothed point as the starting point of the smoothed track;
  • Step S12 Determine a point on the line connecting each subsequently collected point and the adjacent previous point as a smoothed point according to the preset rule, and use the smoothed point as an intermediate point of the smoothed track;
  • Step S13 calculating a distance from the intermediate point to the starting point, and using the distance as the touch distance;
  • Step S14 determining whether the touch distance reaches the threshold distance. When the touch distance does not reach the threshold distance, the process returns to step S12. When the touch distance reaches the threshold distance, the process proceeds to step S20.
  • An embodiment of the present invention adopts linear smoothing as the preset rule of the smoothing processing. When the point information of the first two points has been collected, a point determined on the line connecting the first point and the second point is used as a smoothed point, and this smoothed point is used as the starting point of the smoothed track; for each subsequently collected point, a point determined on the line connecting it to the adjacent previous point is used as a smoothed point, and this smoothed point is used as an intermediate point of the smoothed track. Preferably, the midpoint of the two adjacent points is used as the smoothed point.
  • The smoothing processing is performed in real time: each time an intermediate point is obtained, the distance from the intermediate point to the starting point is calculated and used as the touch distance, to determine whether the touch motion has covered a distance sufficient for gesture recognition.
  • If not, the subsequently collected points are smoothed in turn to obtain further intermediate points of the smoothed track, and the touch distance is recalculated, until the touch distance reaches the threshold distance, indicating that the condition for performing gesture recognition is met and gesture recognition can be carried out.
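  • For concreteness, the sketch below (in Python) illustrates the linear smoothing and touch-distance check just described, using the preferred midpoint rule: each smoothed point is the midpoint of two adjacent collected points, the first smoothed point is the starting point of the smoothed track, and the distance from the latest intermediate point to that starting point is compared against the threshold distance. It is a minimal illustration under those assumptions; the function names and the MOV_LIMIT value of 90 are illustrative rather than taken from the patent.

```python
import math

MOV_LIMIT = 90  # illustrative threshold distance (the description mentions e.g. 80 or 90 points)

def midpoint(p, q):
    """Preferred linear smoothing rule: the midpoint of two adjacent collected points."""
    return ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def smooth_and_check(collected):
    """Smooth the collected points and report whether the touch distance
    (latest intermediate point to the starting point Q0) reaches the threshold.
    Returns (reached, smoothed_points)."""
    if len(collected) < 2:
        return False, []
    smoothed = [midpoint(a, b) for a, b in zip(collected, collected[1:])]  # Q0, Q1, Q2, ...
    touch_dist = distance(smoothed[-1], smoothed[0])
    return touch_dist >= MOV_LIMIT, smoothed
```

  • Calling smooth_and_check on the points collected so far after each new report from the touch panel mirrors the real-time loop of steps S12 to S14.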
  • FIG. 4 is a schematic diagram of a refinement process of the second embodiment of step S10 of FIG. 1. Based on the first embodiment of step S10, the step S10 further includes:
  • Step S15 When the distance between the intermediate point and the starting point is less than the threshold distance, predict the next point on the smoothed track according to the intermediate point and the adjacent previous smoothed point;
  • Step S16 calculating a distance from the predicted point to the starting point, and using the distance as the touch distance;
  • Step S17 determining whether the touch distance reaches the threshold distance. When the touch distance does not reach the threshold distance, the process returns to step S12. When the touch distance reaches the threshold distance, the process proceeds to step S20.
  • The smoothed points are obtained by smoothing adjacent pairs of collected points. For example, when points P0 and P1 have been collected, the starting point Q0 of the smoothed track is obtained; when point P2 is collected, the intermediate point Q1 is obtained; and so on, so that the intermediate point Q7 is obtained only when point P8 has been collected. In other words, the availability of a smoothed point always lags behind the collection of the same number of points, and when only one or two points have been collected the prediction accuracy for the touch motion track is low.
  • A point on the extension of the line from the intermediate point Q6 to the intermediate point Q7 is taken as the predicted point Q8; preferably, the distance from the predicted point Q8 to the intermediate point Q7 is the same as the distance from the intermediate point Q7 to the intermediate point Q6. The distance from the predicted point Q8 to the starting point Q0 is then calculated and used as the touch distance, to determine whether the touch motion has covered a distance sufficient for gesture recognition. When the distance from the predicted point to the starting point is less than the threshold distance, the predicted point is discarded.
  • That is, when the distance from the predicted point Q8 to the starting point Q0 is less than the threshold distance, the predicted point Q8 is discarded; when the point P9 is subsequently collected, the intermediate point Q8 is obtained, and the touch distance continues to be calculated and/or the next point on the smoothed track is predicted.
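  • The prediction step can be sketched as extending the segment between the two most recent smoothed points by its own length, which matches the preferred rule above (the distance from Q8 to Q7 equals the distance from Q7 to Q6). This is a minimal sketch under that assumption; the helper name is illustrative.

```python
def predict_next(prev_smoothed, last_smoothed):
    """Predict the next point on the smoothed track by linear extrapolation:
    e.g. Q8 lies on the extension of Q6 -> Q7, at the same spacing."""
    dx = last_smoothed[0] - prev_smoothed[0]
    dy = last_smoothed[1] - prev_smoothed[1]
    return (last_smoothed[0] + dx, last_smoothed[1] + dy)
```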
  • Step S20 When the touch distance reaches a threshold distance, perform gesture recognition according to the smoothed touch motion track, and output a gesture recognition result.
  • the threshold distance is a minimum distance required for gesture recognition.
  • gesture recognition is performed according to the smoothed touch motion track.
  • Preferably, a circle is drawn with the starting point on the smoothed track as the center and the threshold distance MOV_LIMIT (for example, 80 or 90 points) as the radius; when the intermediate point or the predicted point falls outside this circle, the touch distance has reached the threshold distance. Of course, the first collected point may also be used as the center of the circle.
  • According to the position where the connecting line exits the circle, the direction of the touch motion is identified as a left shift, right shift, up shift or down shift, and the gesture recognition result is output.
  • FIG. 6 is a schematic diagram of a refinement process of an embodiment of step S20 in FIG. 1.
  • Step S21 calculating an angle formed by a line connecting the starting point to the predicted point and a reference line, and using the angle as a linear motion angle;
  • Step S22 when the linear motion angle is within the first angular range, determining that the touch motion is an upward shift, and when the linear motion angle is within the second angular range, determining that the touch motion is a left shift And determining that the touch motion is a downward shift when the linear motion angle is within a third angular range, and determining that the touch motion is a right shift when the linear motion angle is within a fourth angular range.
  • The angle formed by the line connecting the starting point to the predicted point and the reference line is the angle of the linear motion during the touch operation. Two-dimensional coordinates can be established on the touch panel and a certain direction selected as the reference line, that is, the line representing 0° in the two-dimensional coordinates; the angle between the line from the starting point to the predicted point and this reference line is then taken as the linear motion angle. Of course, the linear motion angle from the starting point to the predicted point may also be calculated directly from the coordinate information of the two points. The direction of the touch motion is divided into left shift, right shift, up shift and down shift, and, as can be seen from FIG. 5, the actual direction of the touch motion is arbitrary. An embodiment of the present invention takes 0° (or 360°) as horizontally to the right, and classifies 45° to 135° as an up shift, 135° to 225° as a left shift, 225° to 315° as a down shift, and 0° to 45° together with 315° to 360° as a right shift, so that a touch motion in any direction can be recognized as a left, right, up or down shift.
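  • The angle computation and the four direction ranges can be sketched as follows. The sketch assumes a coordinate system in which 0° points horizontally to the right, angles grow counter-clockwise and the Y axis points upward; whether the touch panel's Y coordinate actually grows upward or downward depends on the hardware, so the "up" and "down" ranges may need to be mirrored in practice. Function names are illustrative.

```python
import math

def linear_motion_angle(start, predicted):
    """Angle of the line from the starting point to the predicted point,
    measured from the horizontal-right reference line, normalised to [0, 360)."""
    dx = predicted[0] - start[0]
    dy = predicted[1] - start[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

def classify_direction(angle):
    """Map the linear motion angle to one of the four gestures, using the
    ranges given in the description (0/360 degrees = horizontally to the right)."""
    if 45.0 <= angle < 135.0:
        return "up"
    if 135.0 <= angle < 225.0:
        return "left"
    if 225.0 <= angle < 315.0:
        return "down"
    return "right"  # 0-45 degrees and 315-360 degrees
```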
  • In this embodiment, the touch motion track is smoothed in real time and gesture recognition is performed as soon as the touch distance reaches the threshold distance, so that the smart TV responds to the touch action of the touch remote controller synchronously while the user is sliding on the touch panel, and the response is rapid.
  • FIG. 7 is a schematic flowchart diagram of a second embodiment of a gesture recognition method based on a touch remote controller according to the present invention. Based on the first embodiment of the above-described touch remote control-based gesture recognition method, the method further includes:
  • Step S30 When the touch distance reaches the threshold distance, count the number of points from the starting point to the predicted point as N, and take the (N+1)th point collected on the touch motion track as the first point of the next segment of the touch motion track.
  • Performing gesture recognition according to the smoothed touch motion track and outputting the gesture recognition result completes the recognition for one segment. In actual use, the length of the touch motion track formed by a single gesture motion varies, and the motion is likely not to have ended when the threshold distance is reached; as the user's finger continues to move there may even be turns or changes of direction. To keep the switching of the smart TV screen consistent with the user's finger swipe, the touch motion must therefore be recognized continuously. In an embodiment of the present invention, the number of collected points and the number of points obtained by the smoothing processing are counted while the touch distance is calculated: when the touch distance reaches the threshold distance, the number of points from the starting point to the predicted point is recorded as N, and the (N+1)th point collected on the touch motion track is used as the first point of the next segment of the touch motion track, on which the smoothing, the gesture recognition and the output of the gesture recognition result are performed again. If the distance from an intermediate point obtained by the smoothing processing to the starting point already reaches the threshold distance and no predicted point is involved, the number of points from the starting point to that intermediate point is counted as N, and the (N+1)th point collected on the touch motion track is likewise taken as the first point of the next segment. By continuously smoothing the touch motion track, recognizing the gesture and outputting the gesture recognition result in this way, the embodiment determines the gesture motion of the entire touch motion and enables the smart TV to switch screens following the user's finger swipe.
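  • The continuous, segment-by-segment recognition described above can be pictured as a streaming loop that accumulates collected points, emits a gesture as soon as the touch distance of the current segment reaches the threshold, and then starts a new segment with the next collected point. The self-contained sketch below uses, for brevity, the latest intermediate point rather than a predicted point when checking the threshold; all names and the threshold value are illustrative.

```python
import math

MOV_LIMIT = 90  # illustrative threshold distance

def recognise_stream(points):
    """Continuously recognise directional gestures on a stream of collected touch
    points. After a gesture is emitted for a segment of N points, the next
    collected point becomes the first point of the following segment."""
    gestures, segment = [], []
    for p in points:
        segment.append(p)
        if len(segment) < 2:
            continue
        # Midpoint smoothing: Q0 is the starting point, the latest midpoint the intermediate point.
        start = ((segment[0][0] + segment[1][0]) / 2.0, (segment[0][1] + segment[1][1]) / 2.0)
        mid = ((segment[-2][0] + segment[-1][0]) / 2.0, (segment[-2][1] + segment[-1][1]) / 2.0)
        if math.hypot(mid[0] - start[0], mid[1] - start[1]) >= MOV_LIMIT:
            angle = math.degrees(math.atan2(mid[1] - start[1], mid[0] - start[0])) % 360.0
            if 45.0 <= angle < 135.0:
                gestures.append("up")
            elif 135.0 <= angle < 225.0:
                gestures.append("left")
            elif 225.0 <= angle < 315.0:
                gestures.append("down")
            else:
                gestures.append("right")
            segment = []  # the next collected point starts the next segment of the track
    return gestures
```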
  • In a specific example, the touch remote controller transmits the point information on the collected touch motion track to the set-top box of the smart TV in real time, and the touch remote controller and the set-top box exchange information over Bluetooth. Since the Bluetooth protocol specifies that information is sent every 7 ms, the touch remote controller is set to collect point information on the touch motion track every 7 ms. Because transmitting the point information from the touch remote controller to the set-top box takes about 1 to 2 ms and the touch remote controller must wait for the previous transmission to complete before collecting and transmitting the next point, the point information is in practice collected once every 8 to 12 ms. If there are touch points on the touch panel each time point information is collected, the touch motion is continuing; otherwise the touch motion has ended. The touch motion track is smoothed in the set-top box, with the midpoint of adjacent collected points used as the smoothed point. Each touch position on the touch panel may be labelled with coordinates in advance, so that the smoothed points can also be expressed in the coordinates of the touch panel. Table 1 below gives the information on the collected points and the smoothed points (including predicted points).
  • Table 1 Information on collected points and smoothed points (including predicted points)
  • the time in Table 1 is the system time when the point is collected on the touch panel.
  • the value of the threshold distance MOV_LIMIT is set to 90.
  • The distance from the intermediate point Q1 to the starting point Q0 is 11 < 90, so a predicted point Q2 is taken on the extension of the line from the starting point Q0 through the intermediate point Q1, with the distance from the predicted point Q2 to the intermediate point Q1 equal to the distance from the intermediate point Q1 to the starting point Q0; the distance from the predicted point Q2 to the starting point Q0 is calculated as 22 < 90.
  • Points P2, P3, P4 and P5 are then collected in turn, yielding the intermediate points Q2, Q3 and Q4 and the predicted points Q3, Q4 and Q5; since the distances from the intermediate points Q2, Q3, Q4 and the predicted points Q3, Q4, Q5 to the starting point Q0 are all less than 90, gesture recognition is not yet possible. When the intermediate point Q5 is obtained, its distance to the starting point Q0 is still less than 90, so gesture recognition is still not possible.
  • The coordinates of the predicted point Q6 obtained from the intermediate points Q4 and Q5 are (138, 235), and the distance from the predicted point Q6 to the starting point Q0 is 113 > 90, so the condition for performing gesture recognition is met, and the linear motion angle from the starting point Q0 to the predicted point Q6 is calculated.
  • Taking the distance of 113 from the predicted point Q6 to the starting point Q0 as the hypotenuse and the displacement of 3 of the predicted point Q6 from the starting point Q0 along the Y axis as the opposite side of a right triangle, the angle θ is calculated to be 1.5°; that is, the direction of the touch motion from P0 to P6 is a right shift.
  • P7 is then the first point of the next segment of the touch motion trajectory. The above process is repeated to obtain the starting point Q7, the intermediate point Q8 and the predicted point Q9 of the smoothed trajectory; since the distances from the intermediate point Q8 and the predicted point Q9 to the starting point Q7 are less than 90, gesture recognition is not yet possible. When the intermediate point Q9 is obtained, the distance from the intermediate point Q9 to the starting point Q7 is calculated; from the table above this distance is 94 > 90, so the condition for performing gesture recognition is met, and the linear motion angle from the starting point Q7 to the point Q9 is calculated to be 14.9°, that is, the direction of the touch motion from P7 to P10 is a right shift. P10 is then the first point of the next segment of the touch motion track, and the above process is repeated.
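  • The 1.5° figure in this example can be checked from the right-triangle relation used above, with the touch distance 113 as the hypotenuse and the displacement of 3 along the Y axis as the opposite side. The snippet below only reproduces that single check; the 14.9° figure for the P7 to P10 segment follows from the same relation with that segment's own coordinates, which are not reproduced here.

```python
import math

# P0-P6 segment of the worked example: hypotenuse 113, Y-axis displacement 3.
theta = math.degrees(math.asin(3 / 113))
print(round(theta, 1))  # 1.5 -> inside the 0-45 degree range, i.e. a right shift
```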
  • the invention further provides a gesture recognition device based on a touch remote controller.
  • FIG. 8 is a schematic diagram of functional modules of a first embodiment of a gesture recognition apparatus based on a touch remote controller according to the present invention.
  • the touch remote controller-based gesture recognition apparatus includes: a calculation module 10 and an identification module 20.
  • the calculating module 10 is configured to perform smoothing on the touch motion track according to the point information on the touch motion track collected in real time during the touch operation, and calculate the touch distance according to the smoothed touch motion track to determine the Whether the touch distance reaches a threshold distance;
  • The touch remote controller includes a touch panel; the touch panel may also be a touch screen, and may be any panel capable of detecting touch points, such as a resistive touch panel or a capacitive-sensing touch panel.
  • The touch remote controller detects the touched points and, from the sequence of detected points, acquires the touch motion track produced by the sliding of the user's finger. Since the detected points may contain interference or jumps, the raw touch motion trajectory can be very rough, which greatly reduces the speed and accuracy of gesture recognition. An embodiment of the present invention therefore smooths the touch motion trajectory to improve the speed and accuracy of gesture recognition.
  • FIG. 9 is a schematic diagram of a refinement function module of the first embodiment of the computing module 10 of FIG. 8.
  • the calculation module 10 includes a smoothing unit 11, a calculating unit 12, and a comparing unit 13.
  • the smoothing unit 11 is configured to determine, when the point information of the second point is collected, a point on the line connecting the first point and the second point as a smoothed point according to a preset rule, and to use the smoothed point as the starting point of the smoothed track;
  • the calculating unit 12 is configured to calculate a distance from the intermediate point to the starting point, and use the distance as the touch distance;
  • the comparing unit 13 is configured to determine whether the touch distance reaches a threshold distance.
  • An embodiment of the present invention adopts linear smoothing as the preset rule of the smoothing processing. When the point information of the first two points has been collected, a point determined on the line connecting the first point and the second point is used as a smoothed point, and this smoothed point is used as the starting point of the smoothed track; for each subsequently collected point, a point determined on the line connecting it to the adjacent previous point is used as a smoothed point, and this smoothed point is used as an intermediate point of the smoothed track. Preferably, the midpoint of the two adjacent points is used as the smoothed point.
  • The smoothing processing is performed in real time: each time an intermediate point is obtained, the distance from the intermediate point to the starting point is calculated and used as the touch distance, to determine whether the touch motion has covered a distance sufficient for gesture recognition.
  • If not, the subsequently collected points are smoothed in turn to obtain further intermediate points of the smoothed track, and the touch distance is recalculated, until the touch distance reaches the threshold distance, indicating that the condition for performing gesture recognition is met and gesture recognition can be carried out.
  • FIG. 10 is a schematic diagram of a refinement function module of the second embodiment of the computing module 10 of FIG. 8. Based on the first embodiment of the computing module 10 described above, the computing module 10 further includes: a prediction unit 14.
  • the prediction unit 14 is configured to: when the distance between the intermediate point and the starting point is less than the threshold distance, predict the next point on the smoothed track according to the intermediate point and the adjacent previous smoothed point;
  • the calculating unit 12 is further configured to calculate a distance from the predicted point to the starting point, and use the distance as the touch distance;
  • the comparing unit 13 is further configured to determine whether the touch distance reaches a threshold distance.
  • The smoothed points are obtained by smoothing adjacent pairs of collected points. For example, when points P0 and P1 have been collected, the starting point Q0 of the smoothed track is obtained; when point P2 is collected, the intermediate point Q1 is obtained; and so on, so that the intermediate point Q7 is obtained only when point P8 has been collected. In other words, the availability of a smoothed point always lags behind the collection of the same number of points, and when only one or two points have been collected the prediction accuracy for the touch motion track is low.
  • A point on the extension of the line from the intermediate point Q6 to the intermediate point Q7 is taken as the predicted point Q8; preferably, the distance from the predicted point Q8 to the intermediate point Q7 is the same as the distance from the intermediate point Q7 to the intermediate point Q6. The distance from the predicted point Q8 to the starting point Q0 is then calculated and used as the touch distance, to determine whether the touch motion has covered a distance sufficient for gesture recognition. When the distance from the predicted point to the starting point is less than the threshold distance, the predicted point is discarded.
  • That is, when the distance from the predicted point Q8 to the starting point Q0 is less than the threshold distance, the predicted point Q8 is discarded; when the point P9 is subsequently collected, the intermediate point Q8 is obtained, and the touch distance continues to be calculated and/or the next point on the smoothed track is predicted.
  • the identification module 20 is configured to perform gesture recognition according to the smoothed touch motion track when the touch distance reaches a threshold distance, and output a gesture recognition result.
  • the threshold distance is a minimum distance required for gesture recognition.
  • gesture recognition is performed according to the smoothed touch motion track.
  • Preferably, a circle is drawn with the starting point on the smoothed track as the center and the threshold distance MOV_LIMIT (for example, 80 or 90 points) as the radius; when the intermediate point or the predicted point falls outside this circle, the touch distance has reached the threshold distance. Of course, the first collected point may also be used as the center of the circle.
  • According to the position where the connecting line exits the circle, the direction of the touch motion is identified as a left shift, right shift, up shift or down shift, and the gesture recognition result is output.
  • FIG. 11 is a schematic diagram of a refinement function module of an embodiment of the identification module 20 of FIG. 8.
  • the identification module 20 includes an angle unit 21 and a determination unit 22.
  • the angle unit 21 is configured to calculate an angle formed by a line connecting the starting point to the predicted point and a reference line, and use the angle as a linear motion angle;
  • the determining unit 22 is configured to: when the linear motion angle is within a first angular range, determine that the touch motion is upward, and when the linear motion angle is within a second angular range, determine the The touch motion is a left shift, and when the linear motion angle is within a third angle range, determining that the touch motion is a downward shift, and when the linear motion angle is within a fourth angular range, determining the touch motion Move to the right.
  • In this embodiment, the touch motion track is smoothed in real time; when the touch distance reaches the threshold distance, gesture recognition is performed and the gesture recognition result is output, so that while the user is sliding on the touch panel the smart TV immediately responds to the touch action of the touch remote controller, and the response is rapid.
  • FIG. 12 is a schematic diagram of functional modules of a second embodiment of a gesture recognition device based on a touch remote controller according to the present invention.
  • the gesture remote control based gesture recognition device further includes a starting point module 30.
  • the starting point module 30 is configured to, when the touch distance reaches the threshold distance, count the number of points from the starting point to the predicted point as N, and take the (N+1)th point collected on the touch motion track as the first point of the next segment of the touch motion track.
  • the embodiment continuously smoothes the touch motion track, recognizes the gesture, and outputs the gesture recognition result, determines the gesture motion of the entire touch motion, and causes the smart TV to perform screen switching following the user's finger swipe.

Abstract

Disclosed are a gesture recognition method and apparatus based on a touch remote controller. The method comprises the following steps: during a touch operation, smoothing a touch motion track according to point information collected in real time on the touch motion track, and calculating a touch distance according to the smoothed touch motion track so as to determine whether the touch distance reaches a threshold distance (S10); and, when the touch distance reaches the threshold distance, performing gesture recognition according to the smoothed touch motion track and outputting a gesture recognition result (S20). Gesture recognition is carried out during the touch motion, so that while the user is sliding on the touchpad the smart television responds synchronously to the touch action of the touch remote controller, and the response is rapid.
PCT/CN2016/112399 2016-10-14 2016-12-27 Procédé et appareil de reconnaissance de geste basés sur un dispositif de commande à distance à commande tactile WO2018068403A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610899312.XA CN106658123A (zh) 2016-10-14 2016-10-14 基于触控遥控器的手势识别方法及装置
CN201610899312.X 2016-10-14

Publications (1)

Publication Number Publication Date
WO2018068403A1 true WO2018068403A1 (fr) 2018-04-19

Family

ID=58856464

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/112399 WO2018068403A1 (fr) 2016-10-14 2016-12-27 Procédé et appareil de reconnaissance de geste basés sur un dispositif de commande à distance à commande tactile

Country Status (2)

Country Link
CN (1) CN106658123A (fr)
WO (1) WO2018068403A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109542301B (zh) * 2017-09-21 2021-07-09 腾讯科技(深圳)有限公司 智能硬件控制方法和装置、存储介质及电子装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102789327A (zh) * 2012-08-07 2012-11-21 北京航空航天大学 一种基于手势的移动机器人控制方法
EP2602703A1 (fr) * 2011-12-09 2013-06-12 LG Electronics, Inc. Terminal mobile et son procédé de contrôle
CN104932769A (zh) * 2014-03-20 2015-09-23 腾讯科技(深圳)有限公司 一种网页显示方法及装置
CN105138197A (zh) * 2015-08-18 2015-12-09 南京触宏微电子有限公司 滑动轨迹的识别方法及装置
CN105744322A (zh) * 2014-12-10 2016-07-06 Tcl集团股份有限公司 一种屏幕焦点的控制方法及装置

Also Published As

Publication number Publication date
CN106658123A (zh) 2017-05-10

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16918602

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16918602

Country of ref document: EP

Kind code of ref document: A1