CN112034975B - Gesture filtering method, system, device and readable storage medium - Google Patents

Gesture filtering method, system, device and readable storage medium

Info

Publication number
CN112034975B
Authority
CN
China
Prior art keywords
hand, predefined, user, features, gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910474884.7A
Other languages
Chinese (zh)
Other versions
CN112034975A (en)
Inventor
杨星
王维辉
赵如彦
叶开远
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bosch Automotive Products Suzhou Co Ltd
Original Assignee
Bosch Automotive Products Suzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bosch Automotive Products Suzhou Co Ltd
Priority to CN201910474884.7A
Publication of CN112034975A
Application granted
Publication of CN112034975B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/014 Hand-worn input/output arrangements, e.g. data gloves

Abstract

The invention relates to a gesture filtering method, system, device, and readable storage medium. The method comprises the following steps: receiving one or more predefined first hand gestures, wherein each predefined first hand gesture is defined by the values of at least one hand gesture feature; obtaining glove data of a first frame; calculating the values of the hand gesture features of the user from the glove data of the first frame; comparing the hand pose of the user in the first frame, derived from the values of the user's hand gesture features, with the one or more predefined first hand gestures; and, according to the comparison result, filtering the glove data of the first frame and suspending an operation guidance program when it is determined that the hand gesture of the user belongs to the one or more predefined first hand gestures. With the gesture filtering method, system, device, and readable storage medium, inappropriate data can be filtered out before the data is sent to a subsequent analysis algorithm.

Description

Gesture filtering method, system, device and readable storage medium
Technical Field
The present invention relates to the field of data processing, and in particular, to a gesture filtering method, system, device, and readable storage medium.
Background
To meet the growing demands of the manufacturing market, manufacturing enterprises need to continuously improve production efficiency. Smart wearable devices such as smart gloves help enterprises monitor their process flows and improve production efficiency, and are particularly suited to manual and semi-automated manufacturing, of which the automobile industry is representative. Equipped with high-sensitivity sensors, a smart glove can simulate hand motions in real time and greatly shorten worker training time. In addition, the glove can help the user learn the correct operating steps and record the production flow. Glove data can be transmitted to a computer or smartphone via Bluetooth and compared against stored motion patterns.
However, while the user is waiting for material, his or her hand movements may interfere with the back-end analysis of the user's operations and thereby generate erroneous instructions. The data produced by such movements is not well handled at present: existing methods may treat the user's hand motions while waiting for material as abnormal operations.
Disclosure of Invention
In view of the above problems of the prior art, embodiments of the present application provide a gesture filtering method, system, apparatus, and readable storage medium capable of filtering out inappropriate data before the data is sent to a subsequent analysis algorithm.
A method for filtering hand gestures of a user according to one embodiment of the present specification comprises:
receiving one or more predefined first hand gestures, wherein each predefined first hand gesture is defined by a value of at least one hand gesture feature;
obtaining glove data of a first frame;
calculating the value of the hand gesture feature of the user according to the glove data of the first frame;
comparing the hand pose of the user in the first frame derived from the values of the hand pose features of the user with the one or more predefined first hand poses; and
according to the comparison result, filtering the glove data of the first frame and pausing an operation guidance program when it is determined that the hand gesture of the user belongs to the one or more predefined first hand gestures.
Preferably, the method further comprises:
receiving one or more predefined second hand gestures while receiving the one or more predefined first hand gestures, wherein each predefined second hand gesture is defined by a value of at least one hand gesture feature, and wherein the one or more predefined second hand gestures are different from the one or more predefined first hand gestures;
after determining that the hand pose of the user belongs to the one or more predefined first hand poses, comparing the hand pose of the user derived from glove data of a subsequent second frame with the one or more predefined second hand poses; and
according to the comparison result, starting the operation guidance program when it is determined that the hand gesture of the user in the second frame belongs to the one or more predefined second hand gestures.
A system for filtering hand gestures of a user according to another embodiment of the present specification, comprising:
a storage means for storing one or more predefined first hand gestures, wherein each predefined first hand gesture is defined by a value of at least one hand gesture feature;
a memory for storing instruction codes; and
at least one processor coupled to the storage device and the memory for executing the stored instruction code, the instruction code comprising:
receiving the one or more predefined first hand gestures from the storage device;
obtaining glove data of a first frame;
calculating the value of the hand gesture feature of the user according to the glove data of the first frame;
comparing the hand pose of the user in the first frame derived from the values of the hand pose features of the user with the one or more predefined first hand poses; and
according to the comparison result, filtering the glove data of the first frame and pausing the operation guidance program when it is determined that the hand gesture of the user belongs to the one or more predefined first hand gestures.
An apparatus for filtering hand gestures of a user according to yet another embodiment of the present specification includes:
a predefined unit for receiving one or more predefined first hand gestures, wherein each predefined first hand gesture is defined by a value of at least one hand gesture feature;
a data acquisition unit for acquiring glove data of a first frame;
a calculation unit for calculating a value of a hand posture feature of the user from the glove data of the first frame;
a comparison unit for comparing the hand pose of the user in the first frame derived from the values of the hand pose features of the user with the one or more predefined first hand poses; and
a control unit for filtering glove data of the first frame and suspending an operation guidance program when it is determined, according to the comparison result, that the hand gesture of the user belongs to the one or more predefined first hand gestures.
A machine-readable storage medium according to embodiments of the present specification has executable instructions stored thereon, wherein the executable instructions, when executed, cause a machine to perform the aforementioned method.
As can be seen from the above, the solution of the embodiments of this specification can filter out inappropriate data and suspend the operation guidance program before the data is sent to the subsequent operation analysis algorithm, and restart the operation guidance program when appropriate. Compared with the prior art, the scheme of these embodiments thus prevents sensor data generated by the user's hand movements while waiting for material from affecting the subsequent analysis algorithm.
Drawings
The features, characteristics, advantages and benefits of the present invention will become apparent from the following detailed description taken in conjunction with the accompanying drawings.
FIG. 1 illustrates a flow chart of a method for filtering hand gestures of a user according to one embodiment of the present description.
FIG. 2 shows a schematic diagram of a system for filtering hand gestures of a user according to another embodiment of the present description.
Fig. 3 shows a schematic diagram of an apparatus for filtering hand gestures of a user according to a further embodiment of the present description.
List of reference numerals
10: user' s
20: user's hand
100: method for filtering hand gestures of user
110: receiving predefined first and second hand gestures
120: obtaining glove data of a first frame
130: computing values of hand gesture features of a user
140: determining whether a hand gesture of a user belongs to a predefined first hand gesture
150: filtering glove data of a first frame and pausing an operation instruction program
160: calculating values of hand gesture features of the user in a subsequent second frame
170: determining whether the hand gesture of the user in the second frame belongs to a predefined second hand gesture
180: restarting the operation guidance program
200: system for filtering hand gestures of user
210: storage device
220: memory device
230: processor and method for controlling the same
300: device for filtering hand gestures of user
310: predefined unit
320: data acquisition unit
330: calculation unit
340: comparison unit
350: control unit
Detailed Description
The subject matter described herein will now be discussed with reference to example embodiments. It should be appreciated that these embodiments are discussed only to enable a person skilled in the art to better understand and thereby practice the subject matter described herein, and are not limiting of the scope, applicability, or examples set forth in the claims. Changes may be made in the function and arrangement of elements discussed without departing from the scope of the disclosure. Various examples may omit, replace, or add various procedures or components as desired. For example, the described methods may be performed in a different order than described, and various steps may be added, omitted, or combined. In addition, features described with respect to some examples may be combined in other examples as well.
As used herein, the term "comprising" and its variations are open-ended, meaning "including, but not limited to". The term "based on" means "based at least in part on". The terms "one embodiment" and "an embodiment" mean "at least one embodiment". The term "another embodiment" means "at least one other embodiment". The terms "first", "second", and the like may refer to different or the same objects. Other definitions, whether explicit or implicit, may be included below. Unless the context clearly indicates otherwise, the definition of a term is consistent throughout this specification.
Various embodiments of the present invention are described in detail below with reference to the attached drawing figures.
Existing smart gloves process the collected sensor data into gesture data and then send all of that data to a subsequent analysis algorithm. However, the user's hand movements while waiting for material may be treated as abnormal operations by the existing analysis algorithm, and it is impractical for the user to take off the glove or switch off its sensors every time he or she waits for material just to avoid the erroneous data those movements generate. For this reason, the inventors propose the following method to solve this problem of the prior art.
FIG. 1 illustrates a flowchart of a computer-implemented method 100 for filtering hand gestures of a user, according to one embodiment of the present description. The method 100 may be performed by at least one processor.
As shown in fig. 1, at step 110, one or more predefined first hand gestures are received, and one or more predefined second hand gestures are received at the same time. According to the invention, each predefined first and second hand gesture is defined by the values of at least one hand gesture feature. Preferably, the one or more predefined second hand gestures are different from the one or more predefined first hand gestures. For example, the predefined first and second hand gestures may be defined as actions that a user can easily perform while waiting for material but rarely uses in the production process. The predefined first and second hand gestures may be, for example but not limited to, making a fist, opening all five fingers, waving both hands, and the like. Those skilled in the art can envision other predefined hand gestures for other application scenarios. Each hand gesture may include one or more hand postures and/or hand motions, each represented as a hand posture feature or a hand motion feature of the predefined hand gesture. The hand gesture features thus include hand posture features and hand motion features, which are associated with one or more posture values and motion values, respectively. For example, a predefined hand gesture may include at least one of: a posture of one or more palms, a posture of one or more fingers, a movement of one or more palms, and/or a movement of one or more fingers. Palm postures are, for example, palm flat, palm forward, palm backward, palm left, palm right, and so on. Finger postures are, for example, a fist, five fingers open, the index finger extended while the remaining four fingers form a fist, and so on. Palm movements are, for example, waving the hand left and right or back and forth. Finger movements are, for example, crossing movements of all five fingers, crossing movements of the index and middle fingers, and so on.
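Purely as an illustration (none of the identifiers below appear in the patent), such predefined gestures might be encoded as mappings from feature names to admissible values, which is the shape of data the filter would receive at step 110:

```python
# Minimal sketch of predefined hand gestures encoded as feature -> value
# constraints; every name here is illustrative, not taken from the patent.
FIST = {
    "hand_selection": {"left", "right", "both"},  # either or both hands
    "min_fingers_bent": 3,                        # at least three folded fingers
    "min_bend_angle_deg": 70.0,                   # bend threshold per finger
}

FIVE_FINGERS_OPEN = {
    "hand_selection": {"left", "right", "both"},
    "finger_state": "open",                       # all five fingers stretched
}

# The set received at step 110 as the "predefined first hand gestures".
PREDEFINED_FIRST_GESTURES = [FIST, FIVE_FINGERS_OPEN]
```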
At step 120, glove data of a first frame is obtained (preferably in real time) from sensors on the glove of the user 10 (see FIG. 3), such as MEMS (micro-electro-mechanical system) sensors. A smart glove (not shown) is worn on the hand 20 of the user 10. A plurality of azimuth sensors (not shown) for collecting azimuth data of the phalanges and metacarpals are placed on the glove at the positions corresponding to the phalanges and metacarpals, so that the azimuth angle of each phalanx and metacarpal can be calculated. In one embodiment, each finger portion carries a six-axis (acceleration, gyroscope) sensor and the back of the hand carries a nine-axis (acceleration, gyroscope, magnetic) sensor; the hand posture and orientation are estimated from the data of these sensors. In one example, the step of obtaining glove data of the first frame may include obtaining azimuth data of the first frame with a plurality of azimuth sensors on the glove corresponding to the phalanges and metacarpals of the hand.
Azimuth data generally refers to any data from which the azimuth of a carrier can be calculated, such as the carrier's angular velocity and acceleration. The azimuth data can be obtained from a triaxial micro gyroscope, a triaxial micro accelerometer, and a triaxial geomagnetic sensor.
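The patent does not specify how these three sensors are fused into an angle estimate. As a hedged sketch of one common approach, a complementary filter that blends the integrated gyroscope rate with an accelerometer-derived angle might look like the single-axis example below; all names are illustrative assumptions:

```python
import math

def accel_pitch_deg(ax, ay, az):
    """Pitch angle (degrees) derived from raw accelerometer axes."""
    return math.degrees(math.atan2(-ax, math.hypot(ay, az)))

def complementary_filter(prev_angle_deg, gyro_rate_dps, accel_angle_deg,
                         dt_s, alpha=0.98):
    """Blend the integrated gyroscope angle with the accelerometer angle.

    alpha close to 1 trusts the smooth, short-term gyroscope signal,
    while the accelerometer term corrects its long-term drift; yaw
    (azimuth proper) would use the geomagnetic sensor the same way.
    """
    gyro_angle = prev_angle_deg + gyro_rate_dps * dt_s
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle_deg
```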
Referring to fig. 3, when the hand 20 of the user 10 makes a fist (solid line), the plurality of azimuth sensors on the glove acquire one frame of glove data for the fist. When the five fingers of the hand 20 are open (dashed line), the azimuth sensors acquire one frame of glove data for the open hand. The glove data of each frame may include values of hand posture features and/or values of hand motion features. The value of each hand posture feature or hand motion feature indicates the current value of the corresponding one of the plurality of hand gesture features.
The values of the hand posture features may include values of one or more of the following exemplary hand posture features:
finger bending feature—one or more finger gesture features defined for each finger. For example, the finger gesture feature may be a state that may include a bend and/or curve, such as stretching, folding, and/or unfolding. Each finger (thumb, index finger, middle finger, ring finger, and/or little finger) is assigned one or more specific finger features. For example, a "fist" may be represented as a "folded" state of at least three of the thumb, index finger, middle finger, ring finger, and little finger. The "five-finger open" may be represented as an "open" state of the thumb, index finger, middle finger, ring finger, and little finger. The stretching, folding and/or unfolding state can be calculated by azimuth data acquired by a plurality of azimuth sensors arranged at corresponding positions of phalanges on the glove.
Palm posture features—the one or more palm posture features include, for example, hand selection, palm direction, palm rotation, and/or hand position. Hand selection identifies which hand is active; its values may include, for example, right hand, left hand, both hands, and/or either hand. Palm direction defines the direction the palm of the active hand faces; its values may include, for example, left, right, up, down, forward, and/or backward. Palm rotation defines the rotation state of the palm of the active hand; its values may include, for example, rotated left, right, up, down, forward, and/or backward. Hand position identifies the spatial position of the active hand; its values may include, for example, the center of the field of view (FOV), the right side of the FOV, the left side of the FOV, the top of the FOV, the bottom of the FOV, the front of the FOV, and/or the rear of the FOV.
Finger contact condition feature—one or more finger contact features are defined for each finger. A contact feature may define the touch condition and/or touch type of any two or more fingers; its values may include, for example, no touch, fingertip touch, and/or full touch.
Finger relative position condition feature—one or more finger relative position features are defined for each finger. Each finger relative position feature may define the position of one finger relative to another, for example one or more fingers being to the left, right, above, below, inward, outward, forward, and/or rearward of one or more other fingers.
The values of the hand motion features may include values of one or more of the following exemplary hand motion features:
motion attribute feature-the one or more motion attribute features may include, for example, motion size, motion speed, and/or motion position. The motion size may identify the size (range) of the motion, and the value of the motion size may include, for example, small, normal, and/or large. The speed of movement may define the speed of movement and the value of the speed of movement may include, for example, slow, normal, fast and/or abrupt. The motion location may identify a spatial location at which the motion is performed, and the value of the motion location may include, for example, a center of the FOV, a right side of the FOV, a left side of the FOV, a top of the FOV, a bottom of the FOV, a front of the FOV, and/or a rear of the FOV.
Motion script feature—one or more motion script features may define the actual motion performed. Motion script values may include, for example, a motion direction, a motion start point, a motion end point, and/or a predefined curve shape. The values of the motion direction feature may include, for example, upward, downward, left-to-right, right-to-left, diagonally toward the upper right, upper left, lower right, and/or lower left, and clockwise or counterclockwise toward the upper right, lower right, upper left, and/or lower left.
At step 130, the values of the hand gesture features of the user are calculated from the glove data of the first frame. In one example, this step may employ the azimuth data of the first frame to calculate the values of the user's hand gesture features. As described above, each hand gesture may be defined by the values of one or more hand gesture features. For example, palm rotation (a palm posture feature) may be defined with 8 posture values (0°, 45°, 90°, 135°, 180°, 225°, 270°, and 315°) to quantify the entire rotation range of 0° to 360°. For example, the hand gesture of a fist may be defined by some of the following values of hand gesture features:
the hand selection feature may be assigned a pose value to indicate either or both hands;
palm direction features may be assigned a pose value to indicate any direction;
the finger bending feature may be assigned a gesture value to indicate that the bending angle of at least three fingers, preferably five fingers, is greater than 70 °, preferably greater than 80 °;
the finger contact condition feature may be assigned a gesture value to indicate that the thumb web is in contact with the index finger back or that the thumb back is in contact with the index finger and the middle finger web; and/or
The finger relative position feature may be assigned a posture value to indicate that the thumb is located outside of the index finger or inside of the index and middle fingers.
As shown in fig. 3 with the hand 20 making a fist (solid line), the hand pose of the fist may be defined by some of the values of the hand pose features described above.
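A matching predicate for this fist pose could be sketched as follows. The 70° threshold and the at-least-three-fingers rule come from the text above; the function and parameter names are assumptions for illustration, and a stricter variant would also check the finger contact and relative position features:

```python
def is_fist(bend_angles_deg, threshold_deg=70.0, min_fingers=3):
    """Return True when enough fingers are bent past the threshold.

    bend_angles_deg maps finger names to computed bend angles, e.g.
    {"thumb": 85.0, "index": 92.0, "middle": 88.0, "ring": 90.0, "little": 87.0}.
    """
    bent = sum(1 for angle in bend_angles_deg.values() if angle > threshold_deg)
    return bent >= min_fingers
```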
For example, the motion speed feature included in the motion attribute features may include up to four motion values (slow, normal, fast, and abrupt). Alternatively, the motion speed feature may be defined with 6 motion values (5 m/s, 10 m/s, 15 m/s, 20 m/s, 25 m/s, and 30 m/s) to quantify the motion speed of a normal human hand from 0 m/s to 30 m/s; a quantization sketch is given after the list below. For example, a hand gesture in which the palm swings left and right may be defined by some of the following values of hand gesture features:
the hand selection feature may be assigned a pose value to indicate either or both hands;
palm direction features may be assigned a pose value to indicate any direction;
the finger bending feature may be assigned a gesture value to indicate that at least three fingers, preferably five fingers, are stretched;
the finger contact condition feature may be assigned a gesture value to indicate that at least three fingers, preferably five fingers, are in contact with each other;
the finger relative position features may be assigned a gesture value to indicate that at least three fingers, preferably five fingers, are in the same plane;
the motion direction feature may be assigned a pose value to indicate palm left to right or right to left;
the motion speed feature may be assigned a pose value to indicate greater than 0m/s; and/or
The motion size feature may be assigned a pose value to indicate any size.
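As a concrete illustration of the quantization mentioned before this list, snapping a measured speed onto the six predefined motion values (5 m/s to 30 m/s in steps of 5 m/s) might be done as below; the helper name is hypothetical:

```python
def quantize_speed(speed_mps, bins=(5, 10, 15, 20, 25, 30)):
    """Snap a measured hand speed (m/s) to the nearest predefined motion value."""
    return min(bins, key=lambda b: abs(b - speed_mps))

# Example: a measured swing of 12.3 m/s maps to the motion value 10 m/s.
assert quantize_speed(12.3) == 10
```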
At step 140, the hand pose of the user in the first frame derived from the values of the hand pose features of the user is compared with the one or more predefined first hand poses. In one example, comparing the hand gesture of the user with the one or more predefined first hand gestures may include matching the values of the hand gesture features of the user against the values of at least one hand gesture feature of each predefined first hand gesture. For example, the values of the finger bending features of the user 10 acquired in one frame are compared with a predetermined threshold (e.g., 70° or 80°) to determine whether the bending angle of at least three fingers, preferably five fingers, exceeds that threshold. If so, at least one hand of the user is in a "fist" state. In this case, the method 100 proceeds to step 150, filters or deletes the glove data of the first frame, and pauses the operation guidance program (e.g., a takt time recognition program). That is, the glove data of the first frame is not sent to the operation analysis algorithm for subsequent processing. Similarly, whether the user's hand pose in the first frame belongs to the one or more predefined first hand poses may be determined from the values of other hand gesture features.
Conversely, if it is determined from the calculated values of the user's hand gesture features that the user's hand gesture in the first frame does not belong to any of the predefined first hand gestures, the method 100 continues to determine whether the user's hand gesture in the glove data of the next frame belongs to one or more predefined first hand gestures. In that case the glove data of the first frame is sent, for example, to the operation analysis algorithm for subsequent processing, thereby guiding the user's operation.
After determining that the hand pose of the user belongs to one or more predefined first hand poses, the method 100 proceeds to step 160, continuing to obtain glove data of a next frame from the sensors on the glove of the user 10 (see fig. 3) and calculating the values of the hand gesture features of the user 10 from the glove data of that frame.
At step 170, the hand pose of user 10 derived from the glove data of the next frame is compared to one or more predefined second hand poses. If it is determined that the hand pose of user 10 in the next frame does not belong to one or more predefined second hand poses, method 100 will continue to check whether the hand pose of the user in the glove data of the subsequent frame belongs to one or more predefined second hand poses.
At step 180, the operation guidance program is started when it is determined that the hand gesture of the user 10 in the next frame belongs to the one or more predefined second hand gestures. This indicates that the user's wait for material has ended and that the production process is resuming.
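Taken together, steps 140 through 180 amount to a small two-state filter. The sketch below is a hedged illustration, not the patent's implementation: the class name and the matches_any callback are assumptions, and the text leaves open whether the frame that triggers the resume is itself forwarded.

```python
class GestureFilter:
    """Hedged sketch of the pause/resume filtering flow of steps 140-180."""

    def __init__(self, first_gestures, second_gestures, matches_any):
        self.first_gestures = first_gestures    # poses that pause guidance
        self.second_gestures = second_gestures  # poses that resume guidance
        self.matches_any = matches_any          # (features, gestures) -> bool
        self.paused = False

    def process_frame(self, frame_features):
        """Return the frame for downstream analysis, or None if filtered."""
        if not self.paused:
            if self.matches_any(frame_features, self.first_gestures):
                self.paused = True   # step 150: filter frame, pause guidance
                return None
            return frame_features    # forward to the operation analysis algorithm
        if self.matches_any(frame_features, self.second_gestures):
            self.paused = False      # step 180: restart the guidance program
        return None                  # frames seen while paused are filtered
```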
Fig. 2 shows a schematic diagram of a system 200 for filtering hand gestures of a user according to another embodiment of the present description. The system 200 is described in detail below in connection with the method 100 of FIG. 1.
The system 200 includes a storage device 210, a memory 220, and at least one processor 230. As shown in fig. 2, the at least one processor 230 is coupled to the storage device and the memory. The storage device 210 is configured to store one or more predefined first hand gestures, wherein each predefined first hand gesture is defined by the values of at least one hand gesture feature. The hand gestures may be predefined according to the requirements of the application scenario and then stored in the storage device 210. The memory 220 is configured to store instruction code. The instruction code, when executed, causes the at least one processor 230 to perform the method 100 shown in fig. 1.
Specifically, the instruction code may cause the at least one processor 230 to:
receiving or reading the one or more predefined first hand gestures, for example from storage 210;
obtaining glove data of a first frame from sensors on the user's glove;
calculating the value of the hand gesture feature of the user according to the glove data of the first frame;
comparing the hand pose of the user in the first frame derived from the values of the hand pose features of the user with the one or more predefined first hand poses; and
according to the comparison result, filtering the glove data of the first frame and pausing an operation guidance program when it is determined that the hand gesture of the user belongs to the one or more predefined first hand gestures.
Preferably, the storage means 210 may be further configured for storing one or more predefined second hand gestures, wherein each predefined second hand gesture is defined by a value of at least one hand gesture feature, and wherein the one or more predefined second hand gestures are different from the one or more predefined first hand gestures.
In this case, the instruction code may further cause the at least one processor to: receiving the one or more predefined second hand gestures from the storage device while receiving the one or more predefined first hand gestures; after determining that the hand pose of the user belongs to the one or more predefined first hand poses, comparing the hand pose of the user derived from glove data of a subsequent second frame with the one or more predefined second hand poses; and in accordance with the comparison, starting the operation guidance program when it is determined that the hand gesture of the user in the second frame belongs to the one or more predefined second hand gestures.
As described above, the one or more predefined hand gestures may preferably comprise at least one of: a posture of one or more palms; a posture of one or more fingers; movement of one or more palms; and movement of one or more fingers. In one example, the hand gesture features include hand posture features and/or hand motion features. The hand posture features include at least one of finger bending features, palm posture features, finger contact condition features, and finger relative position condition features. The hand motion features include motion attribute features and/or motion script features.
In one example, the instruction code may further cause the at least one processor 230 to: obtaining azimuth data of a first or second frame from a plurality of azimuth sensors on the glove corresponding to phalanges and metacarpals of the hand; and using the azimuth data of the first or second frame to calculate a value of the hand gesture feature of the user.
In another example, the instruction code may further cause the at least one processor 230 to: the value of the hand gesture feature of the user is matched to the value of at least one hand gesture feature of each predefined first or second hand gesture.
Fig. 3 shows a schematic diagram of an apparatus 300 for filtering hand gestures of a user according to yet another embodiment of the present description. The apparatus 300 shown in fig. 3 may be implemented in software, hardware, or a combination of software and hardware. The apparatus 300 shown in fig. 3 may be integrated, for example, in the system 200 of fig. 2.
As shown in fig. 3, the apparatus 300 includes a predefined unit 310, a data acquisition unit 320, a calculation unit 330, a comparison unit 340, and a control unit 350. The predefined unit 310 is configured to receive one or more predefined first and second hand gestures, e.g. from the storage device 210 in the system 200 of fig. 2, wherein each predefined first and second hand gesture is defined by the values of at least one hand gesture feature. The data acquisition unit 320 is configured to obtain glove data of a first frame from the sensors on the glove (not shown) of the user 10. The calculation unit 330 is configured to calculate the values of the hand gesture features of the user from the glove data of the first frame. The comparison unit 340 is configured to compare the hand pose of the user in the first frame derived from the values of the hand pose features of the user with the one or more predefined first hand poses. The control unit 350 is configured to filter the glove data of the first frame and suspend the operation guidance program upon determining that the hand gesture of the user belongs to the one or more predefined first hand gestures. Upon determining that the hand pose of the user does not belong to the one or more predefined hand poses, the glove data of the frame is instead sent on for subsequent processing.
Preferably, after determining that the hand gesture of the user belongs to the one or more predefined first hand gestures, the comparison unit 340 may be further configured to compare the hand gesture of the user derived from glove data of a subsequent second frame with the one or more predefined second hand gestures. The control unit 350 may be further configured to start the operation guidance program when it is determined that the hand gesture of the user in the second frame belongs to the one or more predefined second hand gestures according to a result of the comparison.
In one aspect, the data acquisition unit 320 may be further configured to acquire azimuth data of the first or second frame from a plurality of azimuth sensors on the glove corresponding to the phalanges and metacarpals of the hand. The computing unit 330 may be further configured to employ the azimuth data of the first or second frame to calculate the values of the hand gesture features of the user.
In another aspect, the comparison unit 340 may be further configured to match the value of the hand gesture feature of the user with the value of at least one hand gesture feature of each predefined first or second hand gesture.
There is also provided, in accordance with yet another embodiment of the present description, a machine-readable storage medium having stored thereon executable instructions that, when executed, cause a machine to perform the method 100 shown in fig. 1.
The detailed description set forth above in connection with the appended drawings describes exemplary embodiments, but does not represent all embodiments that may be implemented or fall within the scope of the claims. The term "exemplary" used throughout this specification means "serving as an example, instance, or illustration," and does not mean "preferred" or "advantageous over other embodiments. The detailed description includes specific details for the purpose of providing an understanding of the described technology. However, the techniques may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form in order to avoid obscuring the concepts of the described embodiments.
The previous description of the disclosure is provided to enable any person skilled in the art to make or use the disclosure. Various modifications to the disclosure will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other variations without departing from the scope of the disclosure. Thus, the disclosure is not limited to the examples and designs described herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (19)

1. A method (100) for filtering hand gestures of a user, comprising:
receiving one or more predefined first hand gestures (110), wherein each predefined first hand gesture is defined by a value of at least one hand gesture feature;
obtaining a first frame of glove data (120) from sensors on the user's glove;
calculating a value of a hand gesture feature of the user from the glove data of the first frame (130);
comparing the hand pose of the user in the first frame derived from the values of the hand pose features of the user with the one or more predefined first hand poses (140); and
according to the comparison result, filtering the glove data of the first frame and pausing the operation guidance program (150) when it is determined that the hand gesture of the user belongs to the one or more predefined first hand gestures.
2. The method (100) of claim 1, further comprising:
receiving one or more predefined second hand gestures (110) while receiving the one or more predefined first hand gestures, wherein each predefined second hand gesture is defined by a value of at least one hand gesture feature, and wherein the one or more predefined second hand gestures are different from the one or more predefined first hand gestures;
after determining that the hand pose of the user belongs to the one or more predefined first hand poses, comparing (170) the hand pose of the user derived from glove data of a subsequent second frame with the one or more predefined second hand poses; and
according to the comparison result, starting the operation guidance program (180) when it is determined that the hand gesture of the user in the second frame belongs to the one or more predefined second hand gestures.
3. The method (100) of claim 2,
wherein the one or more predefined first or second hand gestures comprise at least one of:
one or more palm postures;
a gesture of one or more fingers;
movement of one or more palms; and
movement of one or more fingers.
4. The method (100) according to claim 1 or 2,
wherein the hand gesture features comprise hand posture features and/or hand motion features, the hand posture features comprising at least one of finger bending features, palm posture features, finger contact condition features, and finger relative position condition features, and the hand motion features comprising motion attribute features and/or motion script features.
5. The method (100) according to claim 1 or 2,
wherein the step of obtaining glove data of the first frame comprises obtaining azimuth data of the first frame by means of a plurality of azimuth sensors on the glove corresponding to phalanges and metacarpals of the hand; and
wherein the step of calculating the values of the hand gesture features of the user comprises using the azimuth data of the first frame to calculate the values of the hand gesture features of the user.
6. The method (100) of claim 2,
wherein comparing the hand gesture of the user with the one or more predefined first or second hand gestures comprises:
the value of the hand gesture feature of the user is matched to the value of at least one hand gesture feature of each predefined first or second hand gesture.
7. A system (200) for filtering hand gestures of a user, comprising:
storage means (210) for storing one or more predefined first hand gestures, wherein each predefined first hand gesture is defined by a value of at least one hand gesture feature;
a memory (220) for storing instruction codes; and
at least one processor (230) coupled to the storage and the memory for executing the stored instruction code, the instruction code causing the at least one processor (230) to:
receiving the one or more predefined first hand gestures from the storage (210);
obtaining glove data of a first frame;
calculating the value of the hand gesture feature of the user according to the glove data of the first frame;
comparing the hand pose of the user in the first frame derived from the values of the hand pose features of the user with the one or more predefined first hand poses; and
according to the comparison result, filtering the glove data of the first frame and pausing the operation guidance program when it is determined that the hand gesture of the user belongs to the one or more predefined first hand gestures.
8. The system (200) of claim 7,
wherein the storage means (210) is further configured for storing one or more predefined second hand gestures, wherein each predefined second hand gesture is defined by a value of at least one hand gesture feature, and wherein the one or more predefined second hand gestures are different from the one or more predefined first hand gestures;
wherein the instruction code further causes the at least one processor (230) to:
receiving the one or more predefined second hand gestures from the storage device while receiving the one or more predefined first hand gestures;
after determining that the hand pose of the user belongs to the one or more predefined first hand poses, comparing the hand pose of the user derived from glove data of a subsequent second frame with the one or more predefined second hand poses; and
according to the comparison result, starting the operation guidance program when it is determined that the hand gesture of the user in the second frame belongs to the one or more predefined second hand gestures.
9. The system (200) of claim 8,
wherein the one or more predefined first or second hand gestures comprise at least one of:
one or more palm postures;
a gesture of one or more fingers;
movement of one or more palms; and
movement of one or more fingers.
10. The system (200) of claim 7 or 8,
wherein the hand gesture features comprise hand posture features and/or hand motion features, the hand posture features comprising at least one of finger bending features, palm posture features, finger contact condition features, and finger relative position condition features, and the hand motion features comprising motion attribute features and/or motion script features.
11. The system (200) of claim 7 or 8,
wherein the instruction code further causes the at least one processor (230) to:
obtaining azimuth data of the first frame from a plurality of azimuth sensors on the glove corresponding to phalanges and metacarpals of the hand; and
calculating the values of the hand gesture features of the user using the azimuth data of the first frame.
12. The system (200) of claim 8,
wherein the instruction code further causes the at least one processor (230) to:
the value of the hand gesture feature of the user is matched to the value of at least one hand gesture feature of each predefined first or second hand gesture.
13. An apparatus (300) for filtering hand gestures of a user, comprising:
a predefined unit (310) for receiving one or more predefined first hand gestures, wherein each predefined first hand gesture is defined by a value of at least one hand gesture feature;
a data acquisition unit (320) for acquiring glove data of a first frame;
a computing unit (330) for computing values of hand gesture features of the user from the glove data of the first frame;
a comparison unit (340) for comparing the hand pose of the user in the first frame derived from the values of the hand pose features of the user with the one or more predefined first hand poses; and
a control unit (350) for filtering glove data of the first frame and suspending an operation guidance program when it is determined that the hand gesture of the user belongs to the one or more predefined first hand gestures, according to a comparison result.
14. The apparatus (300) of claim 13,
wherein the predefined unit (310) is further configured to receive one or more predefined second hand gestures, wherein each predefined second hand gesture is defined by a value of at least one hand gesture feature, and wherein the one or more predefined second hand gestures are different from the one or more predefined first hand gestures;
wherein, upon determining that the hand pose of the user belongs to the one or more predefined first hand poses, the comparison unit (340) is further configured to compare the hand pose of the user derived from glove data of a subsequent second frame with the one or more predefined second hand poses; and
wherein the control unit (350) is further configured to start the operation guidance program when it is determined, based on the comparison result, that the hand gesture of the user in the second frame belongs to the one or more predefined second hand gestures.
15. The apparatus (300) of claim 14,
wherein the one or more predefined first or second hand gestures comprise at least one of:
one or more palm postures;
a gesture of one or more fingers;
movement of one or more palms; and
movement of one or more fingers.
16. The apparatus (300) of claim 13 or 14,
wherein the hand gesture features comprise hand posture features and/or hand motion features, the hand posture features comprising at least one of finger bending features, palm posture features, finger contact condition features, and finger relative position condition features, and the hand motion features comprising motion attribute features and/or motion script features.
17. The apparatus (300) of claim 13 or 14,
wherein the data acquisition unit (320) is further configured to obtain azimuth data of the first frame from a plurality of azimuth sensors on the glove corresponding to phalanges and metacarpals of the hand; and
wherein the computing unit (330) is further configured to employ the azimuth data of the first frame to compute the values of the hand gesture features of the user.
18. The apparatus (300) of claim 14,
wherein the comparison unit (340) is further configured to match the value of the hand gesture feature of the user with the value of at least one hand gesture feature of each predefined first or second hand gesture.
19. A machine-readable storage medium having stored thereon executable instructions, wherein the executable instructions when executed cause a machine to perform the method of any of claims 1-6.
CN201910474884.7A 2019-06-03 2019-06-03 Gesture filtering method, system, device and readable storage medium Active CN112034975B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910474884.7A CN112034975B (en) 2019-06-03 2019-06-03 Gesture filtering method, system, device and readable storage medium

Publications (2)

Publication Number Publication Date
CN112034975A CN112034975A (en) 2020-12-04
CN112034975B (en) 2024-04-02

Family

ID=73575822

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910474884.7A Active CN112034975B (en) 2019-06-03 2019-06-03 Gesture filtering method, system, device and readable storage medium

Country Status (1)

Country Link
CN (1) CN112034975B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106990840A (en) * 2017-03-27 2017-07-28 联想(北京)有限公司 control method and control system
CN107272908A (en) * 2017-07-11 2017-10-20 北京奇艺世纪科技有限公司 A kind of gesture identifying device, system and gesture identification method
CN108431735A (en) * 2015-12-31 2018-08-21 微软技术许可有限责任公司 Posture vision composer tool
CN109240494A (en) * 2018-08-23 2019-01-18 京东方科技集团股份有限公司 Control method, computer readable storage medium and the control system of electronic data display
CN109416570A (en) * 2015-12-31 2019-03-01 微软技术许可有限责任公司 Use the hand gestures API of finite state machine and posture language discrete value

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8792722B2 (en) * 2010-08-02 2014-07-29 Sony Corporation Hand gesture detection


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant