CN110674683A - Robot hand motion recognition method and system

Info

Publication number: CN110674683A
Application number: CN201910753907.8A
Authority: CN (China)
Prior art keywords: robot, motion, action, data, hand
Legal status: Granted; Active
Other languages: Chinese (zh)
Other versions: CN110674683B (en)
Inventors: 李勋, 刘顺桂, 杨强, 代思程, 张裕汉
Current Assignee: Shenzhen Power Supply Bureau Co Ltd
Original Assignee: Shenzhen Power Supply Bureau Co Ltd
Application filed by Shenzhen Power Supply Bureau Co Ltd
Publication of application CN110674683A; application granted and published as CN110674683B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G06V 40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G06N 3/08 Learning methods

Abstract

The invention relates to a robot hand motion recognition method and system. In the method, raw motion data of the robot's hand motions are collected, valid motion data are detected from the raw data, the robot's hand motion feature values are extracted from each piece of valid motion data, and a trained recognition model then determines the robot's hand motion type from those feature values. The robot's hand motions are thereby detected and recognized automatically, which improves the accuracy and efficiency of detection and recognition.

Description

Robot hand motion recognition method and system
Technical Field
The invention relates to the technical field of robot control, in particular to a robot hand motion recognition method and system.
Background
When a robot performs hand motion operations, an operator must watch whether the robot's hand motions match the initial settings and requirements in order to confirm that the operations are accurate and to catch misjudgments or errors in time. This manual approach is inefficient and time-consuming, and its results often fall short of expectations.
Disclosure of Invention
Therefore, there is a need for a robot hand motion recognition method and system that detect and recognize robot hand motions automatically and improve the accuracy and efficiency of detection and recognition.
The invention provides a robot hand motion recognition method, which comprises the following steps:
collecting raw motion data of the robot's hand motions;
detecting valid motion data from the raw motion data;
extracting the robot's hand motion feature values from each piece of valid motion data;
and determining the robot's hand motion type from the hand motion feature values by using a trained recognition model.
In one embodiment, collecting the raw motion data of the robot's hand motions includes:
using a six-axis sensor to acquire the three-axis acceleration signals and three-axis angular velocity signals corresponding to the robot's hand motions at different moments, and taking these signals as the raw motion data of the hand motions;
and filtering the raw motion data and segmenting the filtered data to obtain raw motion data segments.
In one embodiment, detecting valid motion data from the raw motion data includes:
calculating the angular velocity motion energy of the hand motion from the three-axis acceleration signals and three-axis angular velocity signals corresponding to the robot's hand motion in each raw motion data segment;
and judging from the angular velocity motion energy whether the raw motion data segment constitutes valid motion data.
In one embodiment, the robot hand motion recognition method further includes:
scaling the angular velocity signals in the valid motion data by a preset scaling factor.
In one embodiment, extracting the robot's hand motion feature values from each piece of valid motion data includes:
extracting, from each piece of valid motion data, the average value, variance, standard deviation and maximum value of the acceleration and of the angular velocity for each axis, and labeling each of these values as a hand motion feature value.
In one embodiment, the robot hand motion recognition method further includes:
determining the robot's hand motion type according to the motion instruction sent to the robot and the hand motion feature values, and judging whether the robot's hand motion operation is erroneous;
and generating alarm information to prompt the user when the hand motion operation is judged to be erroneous.
Based on the same inventive concept, the invention also provides a robot hand motion recognition system, which comprises:
a robot comprising an acquisition module, the acquisition module being arranged on a hand of the robot and configured to collect raw motion data of the robot's hand motions; and
a data processing platform in communication connection with the robot and configured to receive the raw motion data of the robot's hand motions, detect valid motion data from the raw motion data, extract the robot's hand motion feature values from each piece of valid motion data, and determine the robot's hand motion type from the hand motion feature values by using a trained recognition model preset in the data processing platform.
In one embodiment, the robot further comprises:
a data preprocessing module arranged on the robot and electrically connected with the acquisition module, configured to receive the raw motion data of the robot's hand motions, filter the raw motion data, segment the filtered data to obtain raw motion data fragments, and package the fragments; and
a communication module arranged on the robot and electrically connected with the data preprocessing module, configured to send the packaged raw motion data fragments to the data processing platform.
In one embodiment, the acquisition module comprises a six-axis sensor, and the six-axis sensor acquires the three-axis acceleration signals and three-axis angular velocity signals corresponding to the robot's hand motions at different moments as the raw motion data of the hand motions.
In one embodiment, the data processing platform comprises:
a judging module in communication connection with the robot through the communication module, configured to calculate the angular velocity motion energy of the robot's hand motion from the three-axis acceleration signals and three-axis angular velocity signals in each raw motion data segment and to judge from the angular velocity motion energy whether the segment constitutes valid motion data;
a motion feature value extraction module electrically connected with the judging module, configured to extract, from each piece of valid motion data, the average value, variance, standard deviation and maximum value of the acceleration and of the angular velocity for each axis, and to label each of these values as a hand motion feature value; and
a motion type recognition module electrically connected with the motion feature value extraction module, configured to determine the robot's hand motion type from the hand motion feature values by using the trained recognition model.
In one embodiment, the judging module is further configured to:
scale the angular velocity signals in the valid motion data by the preset scaling factor before sending the valid motion data to the motion feature value extraction module.
In one embodiment, the motion type recognition module is further configured to:
determine the robot's hand motion type according to the motion instruction sent to the robot and the hand motion feature values, and judge whether the robot's hand motion operation is erroneous;
and generate alarm information to prompt the user when the hand motion operation is judged to be erroneous.
In summary, the embodiments of the invention provide a robot hand motion recognition method and system. In the method, raw motion data of the robot's hand motions are collected, valid motion data are detected from the raw data, the robot's hand motion feature values are extracted from each piece of valid motion data, and a trained recognition model finally determines the robot's hand motion type from those feature values. Because a machine learning model can learn the hand motion feature values and identify motion types from them, feeding the robot's hand motion feature values into the trained recognition model yields the robot's hand motion type, so the robot's hand motions are detected and recognized automatically and the accuracy and efficiency of detection and recognition are improved.
Drawings
Fig. 1 is a schematic flow chart of a robot hand motion recognition method according to an embodiment of the present invention;
fig. 2 is a flowchart of learning training and recognition performed by the robot according to the embodiment of the present invention;
FIG. 3 is a graph of the acceleration of the elbow type motion of a robot hand along the X, Y and Z axes obtained from the sensors in accordance with an embodiment of the present invention;
FIG. 4 is a schematic diagram of extracting hand motions based on the angular velocity signals of the hand motion;
fig. 5 is an electrical schematic diagram of a robot hand motion recognition system according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments are described in detail below with reference to the accompanying figures. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. The invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; various modifications may be made without departing from the spirit and scope of the present invention.
Referring to fig. 1 and fig. 2, an embodiment of the present invention provides a robot hand motion recognition method, including:
step S120, collecting raw motion data of the robot's hand motions;
step S130, detecting valid motion data from the raw motion data;
step S140, extracting the robot's hand motion feature values from each piece of valid motion data;
and step S150, determining the robot's hand motion type from the hand motion feature values by using a trained recognition model.
Manual detection and recognition of robot hand motions suffers from poor accuracy and efficiency. In the present method, the robot's hand motion is instead measured by a sensor: the motion is reflected in analog signals, which are converted into digital values by an analog-to-digital conversion module and transmitted to the data processing platform. A trained recognition model is provided in the data processing platform. Because such a model can learn the hand motion feature values and identify motion types from them, feeding the robot's hand motion feature values into the trained model yields the robot's hand motion type, so the robot's hand motions are detected and recognized automatically and the accuracy and efficiency of detection and recognition are improved. In this embodiment, the recognition model is a decision tree, naive Bayes, or K-nearest-neighbor classification model.
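As an illustration only (not code disclosed in the patent), the following Python sketch shows how a recognition model of one of the named types could be trained on extracted feature vectors with a common machine learning library (scikit-learn); the arrays `X` (feature vectors), `y` (motion-type labels) and the helper name `train_recognition_model` are assumptions made for the example.

```python
# Minimal sketch, assuming X is an (n_samples, 24) array of hand motion
# feature values and y holds the corresponding motion-type labels.
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

def train_recognition_model(X, y, kind="decision_tree"):
    """Fit one of the classifier types named in the description."""
    models = {
        "decision_tree": DecisionTreeClassifier(),
        "naive_bayes": GaussianNB(),
        "knn": KNeighborsClassifier(n_neighbors=5),
    }
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
    model = models[kind]
    model.fit(X_train, y_train)
    print("hold-out accuracy:", model.score(X_test, y_test))  # rough sanity check
    return model
```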
In one embodiment, collecting the raw motion data of the robot's hand motions includes:
using a six-axis sensor to acquire the three-axis acceleration signals and three-axis angular velocity signals corresponding to the robot's hand motions at different moments, and taking these signals as the raw motion data of the hand motions;
and filtering the raw motion data and segmenting the filtered data to obtain raw motion data segments.
Referring to fig. 3, in this embodiment the six-axis sensor is an MPU6050. The MPU6050 is mounted on the robot hand and periodically collects the raw motion data of the robot's various hand motions, with the acquisition baud rate set to 8000-10000 so that enough data points are collected to recognize the hand motions, which improves recognition accuracy. The collected raw motion data are then filtered to remove the influence of factors such as hand tremble and environmental noise on the recognition result, segmented into raw motion data fragments, and packaged and sent to the data processing platform.
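A hedged sketch of this preprocessing step follows; the patent does not specify the filter type or segment length, so the moving-average window and the fixed segment length below are assumptions.

```python
import numpy as np

def preprocess(raw, window=5, segment_len=200):
    """Filter the 6-channel raw motion data and cut it into fixed-length
    segments. `raw` is an (n_samples, 6) array of
    [acc_x, acc_y, acc_z, gry_x, gry_y, gry_z] readings."""
    # Simple moving-average filter to suppress hand tremble and ambient noise.
    kernel = np.ones(window) / window
    filtered = np.column_stack(
        [np.convolve(raw[:, i], kernel, mode="same") for i in range(raw.shape[1])]
    )
    # Split into equal-length raw motion data segments, dropping any tail samples.
    n_segments = len(filtered) // segment_len
    return filtered[: n_segments * segment_len].reshape(n_segments, segment_len, 6)
```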
In one embodiment, detecting valid motion data from the raw motion data includes:
calculating the angular velocity motion energy of the hand motion from the three-axis acceleration signals and three-axis angular velocity signals corresponding to the robot's hand motion in each raw motion data segment;
and judging from the angular velocity motion energy whether the raw motion data segment constitutes valid motion data.
It can be understood that the raw motion data obtained from the six-axis sensor are simply the sensed values of a series of motions and therefore include data unrelated to any motion. Eliminating the motion-irrelevant data reduces the amount of computation and improves the efficiency of hand motion recognition. Referring to fig. 4, in this embodiment the angular velocity motion energy is the sum of the squares of the angular velocity signals on the X, Y and Z axes of the six-axis sensor. When the robot's hand, for example the elbow, turns upward, downward, leftward or rightward, the angular velocity increases and the angular velocity motion energy increases accordingly, so whether a raw motion data segment is valid motion data can be judged from its angular velocity motion energy.
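A minimal sketch of this validity check is given below; the energy threshold value is an assumption, since the patent does not state one.

```python
import numpy as np

def is_valid_segment(segment, energy_threshold=0.5):
    """Return True if the segment's angular velocity motion energy exceeds
    a threshold. `segment` is a (segment_len, 6) array whose last three
    columns are the X, Y and Z angular velocity signals."""
    gyro = segment[:, 3:6]
    # Angular velocity motion energy: sum of squares of the X, Y, Z gyro signals.
    energy = float(np.sum(gyro ** 2))
    return energy > energy_threshold
```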
In one embodiment, the robot hand motion recognition method further includes:
scaling the angular velocity signals in the valid motion data by a preset scaling factor.
It can be understood that, because the raw data values output by the MPU6050 six-axis sensor lie in the range [-32768, 32768], the angular velocity signal values are multiplied by a scaling factor to simplify the calculation without affecting the result. The scaling factor usually lies between 1/400 and 1/600; in this embodiment the preset scaling factor is 1/500.
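A sketch of the scaling step, assuming (as in the earlier sketches) that the gyroscope channels occupy the last three columns of each segment:

```python
import numpy as np

SCALE_FACTOR = 1 / 500  # preset scaling factor used in this embodiment

def scale_angular_velocity(segment):
    """Multiply the three angular velocity channels (raw range roughly
    [-32768, 32768]) by the preset scaling factor; the acceleration
    channels are left unchanged."""
    scaled = np.asarray(segment, dtype=float).copy()
    scaled[:, 3:6] *= SCALE_FACTOR
    return scaled
```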
In one embodiment, extracting the robot's hand motion feature values from each piece of valid motion data includes:
extracting, from each piece of valid motion data, the average value, variance, standard deviation and maximum value of the acceleration and of the angular velocity for each axis, and labeling each of these values as a hand motion feature value.
In this embodiment, the feature values of each hand motion, namely the average value, variance, standard deviation and maximum value of each of the six MPU6050 signals acc_x, acc_y, acc_z, gry_x, gry_y and gry_z, are extracted from the valid motion data and labeled individually for the subsequent training of a recognition model based on a decision tree, naive Bayes, or K-nearest-neighbor classification algorithm. Here acc_x, acc_y and acc_z are the acceleration values of the hand motion on the X, Y and Z axes, and gry_x, gry_y and gry_z are the angular velocity values of the hand motion on the X, Y and Z axes.
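A sketch of this feature extraction (6 channels x 4 statistics = 24 feature values per segment); the dictionary keys are illustrative names, not identifiers from the patent.

```python
import numpy as np

CHANNELS = ["acc_x", "acc_y", "acc_z", "gry_x", "gry_y", "gry_z"]

def extract_features(segment):
    """Compute the 24 feature values named in the description: mean,
    variance, standard deviation and maximum of each of the six channels."""
    features = {}
    for i, name in enumerate(CHANNELS):
        col = segment[:, i]
        features[f"{name}_mean"] = float(np.mean(col))
        features[f"{name}_var"] = float(np.var(col))
        features[f"{name}_std"] = float(np.std(col))
        features[f"{name}_max"] = float(np.max(col))
    return features
```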
In one embodiment, the robot hand motion recognition method further includes:
determining the robot's hand motion type according to the motion instruction sent to the robot and the hand motion feature values, and judging whether the robot's hand motion operation is erroneous;
and generating alarm information to prompt the user when the hand motion operation is judged to be erroneous.
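The comparison-and-alarm step just described could be sketched as follows; `model.predict` assumes the scikit-learn style classifier from the earlier sketch, and the alarm output is simplified to a printed message.

```python
def check_action(model, feature_vector, commanded_action):
    """Recognize the executed hand motion and compare it with the motion
    instruction sent to the robot; report an alarm on mismatch."""
    recognized = model.predict([feature_vector])[0]
    if recognized != commanded_action:
        # Alarm information for prompting the user (e.g. shown on a display).
        print(f"ALARM: commanded '{commanded_action}' but recognized '{recognized}'")
        return False
    return True
```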
Based on the same inventive concept, an embodiment of the present invention further provides a robot hand motion recognition system; referring to fig. 5, it comprises a robot 510 and a data processing platform 520.
The robot 510 comprises an acquisition module 511, which is arranged on a hand of the robot 510 and is configured to collect raw motion data of the hand motions of the robot 510.
The data processing platform 520 is communicatively connected to the robot 510 and is configured to receive the raw motion data of the hand motions of the robot 510, detect valid motion data from the raw motion data, extract the hand motion feature values of the robot 510 from each piece of valid motion data, and determine the hand motion type of the robot 510 from the hand motion feature values by using a trained recognition model preset in the data processing platform 520.
In this embodiment, the hand motion feature values of the robot 510 are input into the trained recognition model to recognize the hand motion type of the robot 510, so that the hand motions of the robot 510 are detected and recognized automatically and the accuracy and efficiency of detection and recognition are improved. In this embodiment, the recognition model is a decision tree, naive Bayes, or K-nearest-neighbor classification model.
In one embodiment, the robot 510 further comprises a data preprocessing module 512 and a communication module 513.
The data preprocessing module 512 is arranged on the robot 510, is electrically connected with the acquisition module 511, and is configured to receive the raw motion data of the hand motions of the robot 510, filter the raw motion data, segment the filtered data into raw motion data fragments, and package the fragments.
The communication module 513 is arranged on the robot 510, is electrically connected with the data preprocessing module 512, and is configured to send the packaged raw motion data fragments to the data processing platform 520.
In this embodiment, the collected raw motion data are filtered to remove the influence of factors such as hand tremble and environmental noise on the recognition result, then segmented into raw motion data fragments, and the fragments are packaged and sent to the data processing platform 520.
In one embodiment, the acquisition module 511 comprises a six-axis sensor, and the six-axis sensor acquires the three-axis acceleration signals and three-axis angular velocity signals corresponding to the hand motions of the robot 510 at different moments as the raw motion data of the hand motions.
In this embodiment, the six-axis sensor is an MPU6050, which periodically collects the raw motion data of the various hand motions of the robot 510 on the robot's hands, with the acquisition baud rate set to 8000-10000 so that enough data points are collected to recognize the hand motions and improve recognition accuracy.
In one embodiment, the data processing platform 520 includes a determination module 521, an action feature value extraction module 522, and an action type identification module 523.
The judging module 521 is in communication connection with the robot 510 through the communication module 513 and is configured to calculate the angular velocity motion energy of the hand motion of the robot 510 from the three-axis acceleration signals and three-axis angular velocity signals in each raw motion data segment, and to judge from the angular velocity motion energy whether the segment constitutes valid motion data.
The motion feature value extraction module 522 is electrically connected to the judging module 521 and is configured to extract, from each piece of valid motion data, the average value, variance, standard deviation and maximum value of the acceleration and of the angular velocity for each axis, and to label each of these values as a hand motion feature value.
The motion type recognition module 523 is electrically connected to the motion feature value extraction module 522 and is configured to determine the hand motion type of the robot 510 from the hand motion feature values by using the trained recognition model.
In this embodiment, because the raw motion data obtained from the six-axis sensor are simply the sensed values of a series of motions and include data unrelated to any motion, the motion-irrelevant data are eliminated to reduce the amount of computation and improve the efficiency of hand motion recognition. Referring to fig. 4, the angular velocity motion energy is the sum of the squares of the angular velocity signals on the X, Y and Z axes of the six-axis sensor; when the hand of the robot 510, for example the elbow, turns upward, downward, leftward or rightward, the angular velocity increases and the angular velocity motion energy increases accordingly, so whether a raw motion data segment is valid motion data can be judged from its angular velocity motion energy.
Further, the feature values of each hand motion, namely the average value, variance, standard deviation and maximum value of each of the six MPU6050 signals acc_x, acc_y, acc_z, gry_x, gry_y and gry_z, are extracted from the valid motion data, and the 24 feature values are labeled individually for the subsequent training of a recognition model based on a decision tree, naive Bayes, or K-nearest-neighbor classification algorithm. Here acc_x, acc_y and acc_z are the acceleration values of the hand motion on the X, Y and Z axes, and gry_x, gry_y and gry_z are the angular velocity values of the hand motion on the X, Y and Z axes.
In addition, the identified hand motion feature values can be used as new samples to further optimize the trained recognition model, thereby further improving its recognition accuracy and efficiency.
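The patent only states that identified feature values can serve as additional samples for optimizing the model; one straightforward (assumed) way to do that is a batch refit, sketched below.

```python
import numpy as np

def refine_model(model, X_old, y_old, X_new, y_new):
    """Append newly identified hand motion feature vectors (with confirmed
    labels) to the training set and refit the recognition model."""
    X = np.vstack([X_old, X_new])
    y = np.concatenate([y_old, y_new])
    model.fit(X, y)
    return model
```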
In one embodiment, the judging module 521 is further configured to:
scale the angular velocity signals in the valid motion data by a preset scaling factor before sending the valid motion data to the motion feature value extraction module 522.
It can be understood that, because the raw data values output by the MPU6050 six-axis sensor lie in the range [-32768, 32768], the angular velocity signal values are multiplied by a scaling factor to simplify the calculation without affecting the result. The scaling factor usually lies between 1/400 and 1/600; in this embodiment the preset scaling factor is 1/500.
In one embodiment, the motion type recognition module 523 is further configured to:
determine the hand motion type of the robot 510 according to the motion instruction sent to the robot 510 and the hand motion feature values, and judge whether the hand motion operation of the robot 510 is erroneous;
and generate alarm information to prompt the user when the hand motion operation of the robot 510 is judged to be erroneous.
In one embodiment, the robot hand motion recognition system further includes a display screen (not shown) that is electrically connected to the motion type recognition module 523 and is configured to receive the alarm information and produce a buzzer and/or text prompt accordingly.
In summary, the embodiments of the invention provide a robot hand motion recognition method and system. In the method, raw motion data of the robot's hand motions are collected, valid motion data are detected from the raw data, the robot's hand motion feature values are extracted from each piece of valid motion data, and a trained recognition model finally determines the robot's hand motion type from those feature values. Because a machine learning model can learn the hand motion feature values and identify motion types from them, feeding the robot's hand motion feature values into the trained recognition model yields the robot's hand motion type, so the robot's hand motions are detected and recognized automatically and the accuracy and efficiency of detection and recognition are improved.
The technical features of the embodiments described above may be combined arbitrarily; for brevity, not all possible combinations are described, but any combination that contains no contradiction should be considered to fall within the scope of this specification.
The embodiments described above express only several implementations of the present invention, and their description, while specific and detailed, should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the inventive concept, and these all fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (12)

1. A robot hand motion recognition method is characterized by comprising the following steps:
acquiring action original data of the action of the robot hand;
detecting effective action data from the action original data;
extracting a hand motion characteristic value of the robot from each effective motion data;
and determining the hand action type of the robot according to the hand action characteristic value by using the trained recognition model.
2. The robot hand motion recognition method of claim 1, wherein the collecting of motion raw data of the robot hand motion comprises:
acquiring three-axis acceleration signals and three-axis angular velocity signals corresponding to the actions of the robot hand at different moments by using a six-axis sensor, and taking the three-axis acceleration signals and the three-axis angular velocity signals as action original data of the actions of the robot hand;
and filtering the action original data, and segmenting the filtered action original data to obtain action original data segments.
3. The robot hand motion recognition method according to claim 2, wherein the detecting valid motion data from the motion raw data includes:
calculating the angular velocity motion energy of the robot hand action according to the acceleration signals of the three axes and the angular velocity signals of the three axes corresponding to the robot hand action in each action original data segment;
and judging whether the motion original data segment is effective motion data or not according to the angular velocity motion energy.
4. A robot hand motion recognition method according to claim 3, further comprising:
and carrying out scaling processing on the angular velocity signals in the effective action data according to a preset scaling factor.
5. A robot hand motion recognition method according to claim 2, wherein said extracting a hand motion feature value of the robot from each of the effective motion data includes:
and extracting, from each effective motion data, the average value, the variance, the standard deviation and the maximum value of the acceleration and of the angular velocity corresponding to each axis, and respectively identifying these values as the hand motion characteristic values.
6. The robot hand motion recognition method of claim 1, further comprising:
determining the hand motion type of the robot according to the motion instruction sent to the robot and the hand motion characteristic value, and judging whether the robot hand motion operation has errors;
and when the robot hand action operation is judged to be wrong, generating alarm information for prompting a user.
7. A robot hand motion recognition system, comprising:
the robot comprises an acquisition module, wherein the acquisition module is arranged on a hand part of the robot and is used for acquiring action original data of the action of the hand part of the robot; and
and the data processing platform is in communication connection with the robot and is used for receiving the motion original data of the robot hand motion, detecting effective motion data from the motion original data, extracting the hand motion characteristic value of the robot from each effective motion data, and determining the hand motion type of the robot according to the hand motion characteristic value by utilizing a trained recognition model preset in the data processing platform.
8. A robotic hand motion recognition system as claimed in claim 7 wherein the robot further comprises:
the data preprocessing module is arranged on the robot, is electrically connected with the acquisition module, and is used for receiving the action original data of the actions of the robot hand, filtering the action original data, segmenting the filtered action original data to obtain action original data fragments, and packaging the action original data fragments; and
and the communication module is arranged on the robot, is electrically connected with the data preprocessing module and is used for sending the packaged action original data fragments to the data processing platform.
9. The robot hand motion recognition system of claim 8, wherein the acquisition module comprises a six-axis sensor, and three-axis acceleration signals and three-axis angular velocity signals corresponding to the robot hand motion at different times are acquired by the six-axis sensor and are used as motion raw data of the robot hand motion.
10. A robotic hand motion recognition system as claimed in claim 9 wherein the data processing platform comprises:
the judging module is in communication connection with the robot through the communication module and is used for calculating the angular velocity movement energy of the robot hand action according to the acceleration signals of three axes and the angular velocity signals of the three axes corresponding to the robot hand action in each action original data segment and judging whether the action original data segment is effective action data or not according to the angular velocity movement energy;
the action characteristic value extraction module is electrically connected with the judgment module and is used for extracting, from each effective action data, the average value, the variance, the standard deviation and the maximum value of the acceleration and of the angular velocity corresponding to each axis, and respectively identifying these values as the hand motion characteristic values; and
the motion type identification module is electrically connected with the action characteristic value extraction module and is used for determining the hand motion type of the robot by utilizing the trained identification model according to the hand motion characteristic value.
11. The robot hand motion recognition system according to claim 10, wherein the judging module is further configured to:
before the effective action data are sent to the action characteristic value extraction module, the angular velocity signals in the effective action data are subjected to scaling processing according to a preset scaling factor.
12. The robot hand motion recognition system according to claim 10, wherein the motion type identification module is further configured to:
determining the hand motion type of the robot according to the motion instruction sent to the robot and the hand motion characteristic value, and judging whether the robot hand motion operation has errors;
and when the robot hand action operation is judged to be wrong, generating alarm information for prompting a user.
Application CN201910753907.8A, priority date 2019-08-15, filing date 2019-08-15, title: Robot hand motion recognition method and system; granted as CN110674683B (Active)

Priority Applications (1)

Application CN201910753907.8A (granted as CN110674683B), priority date 2019-08-15, filing date 2019-08-15, title: Robot hand motion recognition method and system

Applications Claiming Priority (1)

Application CN201910753907.8A (granted as CN110674683B), priority date 2019-08-15, filing date 2019-08-15, title: Robot hand motion recognition method and system

Publications (2)

Publication Number Publication Date
CN110674683A (en) 2020-01-10
CN110674683B (en) 2022-07-22

Family

ID=69075363

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201910753907.8A (Active, granted as CN110674683B) | Robot hand motion recognition method and system | 2019-08-15 | 2019-08-15

Country Status (1)

Country Link
CN (1) CN110674683B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105184325A (en) * 2015-09-23 2015-12-23 歌尔声学股份有限公司 Human body action recognition method and mobile intelligent terminal
CN105617638A (en) * 2015-12-25 2016-06-01 深圳市酷浪云计算有限公司 Badminton racket swinging movement recognizing method and device
CN106919958A (en) * 2017-03-21 2017-07-04 电子科技大学 A kind of human finger action identification method based on intelligent watch
CN107220617A (en) * 2017-05-25 2017-09-29 哈尔滨工业大学 Human body attitude identifying system and method
CN108363959A (en) * 2018-01-22 2018-08-03 河海大学常州校区 One kind being directed to table tennis or badminton action identification method
CN108509897A (en) * 2018-03-29 2018-09-07 同济大学 A kind of human posture recognition method and system
CN108898062A (en) * 2018-05-31 2018-11-27 电子科技大学 A kind of hand motion recognition method based on improved signal segment extraction algorithm

Also Published As

Publication number Publication date
CN110674683B (en) 2022-07-22

Similar Documents

Publication Publication Date Title
US10324425B2 (en) Human collaborative robot system having improved external force detection accuracy by machine learning
CN109397703B (en) Fault detection method and device
US20110144543A1 (en) Behavior recognition apparatus
CN110514957A (en) Substation's automatic detecting method and platform
CN109677341A (en) A kind of information of vehicles blending decision method and device
CN210402266U (en) Sign language translation system and sign language translation gloves
US11544554B2 (en) Additional learning method for deterioration diagnosis system
CN113485302A (en) Vehicle operation process fault diagnosis method and system based on multivariate time sequence data
US20230162484A1 (en) Apparatus and method for generating learning data for artificial intelligence model
CN107358248A (en) A kind of method for improving fall detection system precision
CN110553789A (en) state detection method and device of piezoresistive pressure sensor and brake system
CN106708009A (en) Ship dynamic positioning measurement system multiple-fault diagnosis method based on support vector machine clustering
CN115307896A (en) Equipment health state detection method based on machine learning
CN110674683B (en) Robot hand motion recognition method and system
CN101673449A (en) Method for detecting operation of worker and alarming based on three-dimensional position sensing device
CN113001546B (en) Method and system for improving motion speed safety of industrial robot
CN114846513A (en) Motion analysis system and motion analysis program
CN116625683A (en) Wind turbine generator system bearing fault identification method, system and device and electronic equipment
JP2020107248A (en) Abnormality determination device and abnormality determination method
CN109117719A (en) Driving gesture recognition method based on local deformable partial model fusion feature
CN111730604B (en) Mechanical clamping jaw control method and device based on human body electromyographic signals and electronic equipment
CN112893180A (en) Object touch classification method and system considering friction coefficient abnormal value elimination
CN113172663A (en) Manipulator grabbing stability identification method and device and electronic equipment
CN114626162B (en) Quantitative recognition method for loss degree of contact ball bearing
CN115655576B (en) Automatic sensing method for displacement abnormity of pointer type pressure gauge

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant