CN104778746A - Method for performing accurate three-dimensional modeling based on data glove by using natural gestures

Method for performing accurate three-dimensional modeling based on data glove by using natural gestures

Info

Publication number
CN104778746A
Authority
CN
China
Prior art keywords
port
data
gesture
lilypad
arduino
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510114915.XA
Other languages
Chinese (zh)
Other versions
CN104778746B (en)
Inventor
厉向东
吕士宏
王怡堃
孙小我
张驰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201510114915.XA priority Critical patent/CN104778746B/en
Publication of CN104778746A publication Critical patent/CN104778746A/en
Application granted granted Critical
Publication of CN104778746B publication Critical patent/CN104778746B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The invention discloses a method for performing accurate three-dimensional modeling based on a data glove by using natural gestures. A LilyPad Arduino is mounted on the outside of the glove as the processing chip; hand gestures are monitored through bend sensors, FSR402 resistive film pressure sensors and a JY-901 module, and a LilyPad XBee with an XBee 1mW Wire Antenna wireless module communicates bidirectionally with a computer. The hand coordinates obtained by somatosensory equipment are fused with the motion data, so that natural human gestures can be understood by the computer, the measurement accuracy approaches that of a mouse, and three-dimensional modeling with natural gestures can be implemented.

Description

Method for accurate three-dimensional modeling using natural gestures based on a data glove
Technical field
The present invention relates to a method for accurate three-dimensional modeling using natural gestures based on a data glove.
Background technology
At present, combinations of wearable sensors such as bend sensors and accelerometers can detect basic natural gestures such as the bending and stretching of fingers. Combined with the large-scale capture of natural gesture data and machine learning, gesture recognition at the precision of individual finger joints is already possible. However, the spatial movement of the hand causes cumulative errors in the three-axis accelerometer, so the spatial trajectory of the hand cannot be reproduced accurately, which directly affects the judgement of the relative position of the hand and the user's body. As a result, current data gloves can only be applied to the recognition of simple finger motions and cannot fully support natural gesture interaction.
On the other hand, current somatosensory (motion-sensing) interaction techniques range with coded infrared light and detect distance by measuring the reflected light pattern with a sensor. The precision of such techniques is nominally at the millimetre level, but because of the minimum operating distance the user has to stand relatively far away, so the captured gestures are not very precise; moreover, for lack of an effective machine-learning algorithm, only two simple postures, a clenched fist and an open hand, can be recognized. In addition, when the hand is partially or completely occluded, such techniques cannot correctly capture distance and posture. The application of current somatosensory technology in the field of accurate three-dimensional modeling is therefore very limited.
Somatosensory equipment alone cannot recognize complex sign language, and its measurement accuracy for the hand does not meet the requirements of three-dimensional modeling. A data glove alone, in turn, cannot reconstruct movement trajectories or positional relationships, which limits the gestures it can recognize; and because it only analyses hand postures rather than the whole-body skeleton, the interaction it supports is very limited and its effect in three-dimensional modeling is poor. Moreover, neither somatosensory technology nor traditional gesture gloves can handle a large number of click operations, and both lack direct feedback to the user.
Summary of the invention
The present invention addresses the insufficient accuracy of somatosensory technology in recognizing natural gestures in space, and the posture and position errors of traditional data gloves caused by their sensors. By combining the two technologies, the invention develops a novel data glove in which the recognition of large-scale gestures by the somatosensory technology eliminates the cumulative error of the glove's acceleration sensor, thereby supporting high-precision three-dimensional modeling based on natural gestures.
The object of the invention is achieved through the following technical solution: a method for accurate three-dimensional modeling using natural gestures based on a data glove. A LilyPad Arduino is fixed at the centre of the back of the hand of the data glove, and a strip-shaped bend sensor of fixed length is fixed along each of the five fingers. One pin of each bend sensor is connected to the GND port of the LilyPad Arduino and the other pin to an analog port of the LilyPad Arduino; a 10 kΩ voltage-divider resistor is connected between the 5V port of the LilyPad Arduino and the analog port, so that the reading of the analog port is the analog value of the voltage across the bend sensor, whose magnitude reflects the degree of bending of the finger;
FSR402 resistive film pressure sensors are fixed at the fingertip of the ring finger, the fingertip of the middle finger, the fingertip of the index finger, and on the thumb-facing side of the second joint of the index finger, respectively. One pin of each FSR402 resistive film pressure sensor is connected to the GND port of the LilyPad Arduino and the other pin to a digital port of the LilyPad Arduino; a 20 kΩ voltage-divider resistor is connected between the 5V port and the digital port of the LilyPad Arduino, so that the reading of the digital port is a binary judgement of the voltage across the FSR402 sensor: when the sensor is not pressed the digital reading is 0, and when the sensor is pressed the digital reading is 1;
A JY-901 module, which integrates an accelerometer, a gyroscope and a geomagnetic field sensor, is fixed toward the rear of the back of the hand. The 5V, GND, SCL and SDA ports of the JY-901 module are connected to the 5V, GND, SCL and SDA ports of the LilyPad Arduino, respectively. A 4.7 kΩ pull-up resistor is connected between the SCL port of the JY-901 module and the 5V port of the LilyPad Arduino, and another 4.7 kΩ pull-up resistor is connected between the SDA port of the JY-901 module and the 5V port of the LilyPad Arduino. The LilyPad Arduino communicates with the JY-901 module over the I2C bus protocol, and the JY-901 module measures the dynamic Euler angles of the palm;
A LilyPad XBee is fixed at the centre of the palm; its 5V, GND, rx and tx ports are connected to the 5V, GND, D5 and D6 ports of the LilyPad Arduino, respectively, so that it reads the data of the LilyPad Arduino. A 2.4 GHz XBee 1mW Wire Antenna wireless module is mounted on the LilyPad XBee; it uses the 802.15.4 protocol stack and communicates bidirectionally with the computer through a serial port, transferring the received LilyPad Arduino data to the computer's serial port, which the computer reads through a serial-port library;
An EB-Vibrator vibration motor is fixed toward the rear of the centre of the back of the hand; its 5V, GND and SIG ports are connected to the 5V, GND and D9 ports of the LilyPad Arduino, respectively, and the PWM output of the D9 port regulates the vibration intensity of the motor.
The method specifically comprises the following steps:
(1) A Kinect device produced by Microsoft is connected to the computer, and the official Kinect API provided by Microsoft is used to obtain the dynamic three-dimensional spatial coordinates of the hand and the three-axis acceleration of the hand motion;
(2) Data collection and labelling: each gesture is first assigned a number. A serial read/write library is used to read from the data glove the finger-bend analog values of the bend sensors, the dynamic Euler angles obtained by the JY-901 module and the digital readings of the FSR402 resistive film pressure sensors, together with the three-dimensional spatial coordinates of the hand and the acceleration of the hand motion obtained by the Kinect device. To avoid reading in too much repeated data, the reading interval is set to 20 seconds. After all data have been collected, a spreadsheet file train.csv is created and the data are written to it. Labelling is done manually: train.csv is opened and the gesture number is appended at the end of each row, i.e. to each data sample. The data collection can be implemented on the Visual Studio platform;
(3) A gesture classifier based on a random forest model is trained with the data collected in step 2: the gesture classifier is a tree-structured ensemble composed of n decision trees, and its output is obtained by simple majority voting, i.e. the class that occurs most often among the n decision trees is output. Unlike an ordinary decision tree, each decision tree in the random forest does not use all features of a data point; instead, m features are drawn at random from all features to grow the tree. The gesture classifier uses the Gini index as the measure of split quality, i.e. at every split of every decision tree the splitting rule with the smallest Gini index among all possible splits is chosen. Splitting stops when all data in a leaf node belong to the same gesture class or when a leaf node contains fewer than 2 samples. In this model the decision trees are not pruned;
(4) The random forest classifier of the Scikit-learn machine learning library is used: the two parameters n and m of the classifier are set to 500 and 3, respectively; then the finger-bend analog values of the bend sensors in the data glove, the dynamic Euler angles obtained by the JY-901 module, and the three-dimensional spatial coordinates of the hand and the acceleration of the hand motion obtained by the Kinect device are read in for training, after which the training of the classifier is complete;
(5) Recognition of gesture commands: for a new data point the gesture classifier makes a hierarchical judgement. It first checks whether any of the 4 binary press values carries data; if so, the user's gesture is determined from the position at which the press occurred. If none of the 4 binary press values carries data, the finger-bend analog values of the bend sensors on the glove, the dynamic Euler angles obtained by the JY-901 module, and the three-dimensional spatial coordinates of the hand and the acceleration of the hand motion obtained by the Kinect device are read into the trained random forest model. After the data are read in, each tree takes the 3 features it used during training, classifies the data step by step down to a leaf node according to the splitting conditions of its nodes, and outputs the corresponding class. Once all 500 decision trees have produced their outputs, the class with the most votes is counted and the gesture corresponding to that class is taken as the user's gesture;
(6) The gesture is translated into a command of the three-dimensional modeling software: after the gesture classifier has identified the gesture, a script sends the corresponding command to the three-dimensional modeling software, which builds the corresponding model according to the command received;
(7) The finger-bend analog values of the bend sensors in the data glove, the dynamic Euler angles obtained by the JY-901 module, and the three-dimensional spatial coordinates of the hand and the acceleration of the hand motion obtained by the Kinect device are passed as parameters to the three-dimensional modeling software to set parameters such as the translation, scaling and rotation of the model obtained in step 6, thereby realizing accurate three-dimensional modeling based on natural gestures.
The beneficial effect of the invention is that, when natural gestures are used to interact with the computer, a precision close to that of a mouse is achieved while problems such as gesture occlusion and spatial standing position are avoided, so that the method can be applied well to three-dimensional modeling.
Brief description of the drawings
Fig. 1 is the circuit schematic of the present invention;
Fig. 2 shows the component layout on the physical glove;
In the figures, 1 is the LilyPad Arduino; 2 is the LilyPad XBee; 3 is the XBee 1mW Wire Antenna wireless module; 4 is a bend sensor; 5 is an FSR402 resistive film pressure sensor; 6 is the JY-901 module; 7 is the vibration motor; 8 is a 5V battery; 9 is a 10 kΩ resistor; 10 is a 20 kΩ resistor; 11 is the rx port of the LilyPad XBee; 12 is the tx port of the LilyPad XBee; 13 is the SDA port of the LilyPad Arduino; 14 is the SCL port of the LilyPad Arduino; 15 is a 4.7 kΩ resistor.
Embodiment
The present invention is described in further detail below with reference to the drawings and specific embodiments.
As shown in Fig. 1, a LilyPad Arduino 1 is fixed at the centre of the back of the hand of the data glove of the present invention, and a strip-shaped bend sensor 4 of fixed length is fixed along each of the five fingers. One pin of each bend sensor 4 is connected to the GND port of the LilyPad Arduino and the other pin to an analog port of the LilyPad Arduino; a 10 kΩ voltage-divider resistor 9 is connected between the 5V port of the LilyPad Arduino and the analog port, so that the reading of the analog port is the analog value of the voltage across the bend sensor 4, whose magnitude reflects the degree of bending of the finger;
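Because the analog port sits between the 10 kΩ divider resistor and the bend sensor, the sensor resistance can be recovered from the analog reading with the standard voltage-divider formula. The following sketch assumes a 10-bit ADC and a 5 V supply (typical for the LilyPad Arduino); the function name is illustrative only.

```python
# Sketch only: recover the bend-sensor resistance from the analog reading.
# Assumptions: 10-bit ADC (0..1023), 5 V supply, and the wiring described
# above (10 kOhm resistor from 5V to the analog port, bend sensor from the
# analog port to GND).

ADC_MAX = 1023
V_SUPPLY = 5.0
R_DIVIDER = 10_000  # 10 kOhm voltage-divider resistor

def bend_sensor_resistance(adc_reading: int) -> float:
    """Return the estimated bend-sensor resistance in ohms."""
    v = adc_reading / ADC_MAX * V_SUPPLY          # voltage at the analog port
    return R_DIVIDER * v / (V_SUPPLY - v)         # R_sensor = R * V / (Vcc - V)

# Example: a mid-scale reading of 512 corresponds to roughly 10 kOhm,
# i.e. the sensor resistance equals the divider resistance.
print(round(bend_sensor_resistance(512)))
```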
FSR402 resistive film pressure sensors 5 are fixed at the fingertip of the ring finger, the fingertip of the middle finger, the fingertip of the index finger, and on the thumb-facing side of the second joint of the index finger, respectively. One pin of each FSR402 resistive film pressure sensor 5 is connected to the GND port of the LilyPad Arduino and the other pin to a digital port of the LilyPad Arduino; a 20 kΩ voltage-divider resistor 10 is connected between the 5V port and the digital port of the LilyPad Arduino, so that the reading of the digital port is a binary judgement of the voltage across the FSR402 sensor 5: when the sensor is not pressed the digital reading is 0, and when the sensor is pressed the digital reading is 1;
A JY-901 module 6, which integrates an accelerometer, a gyroscope and a geomagnetic field sensor, is fixed toward the rear of the back of the hand. The 5V, GND, SCL and SDA ports of the JY-901 module 6 are connected to the 5V port, GND port, SCL port 14 and SDA port 13 of the LilyPad Arduino, respectively. A 4.7 kΩ pull-up resistor 15 is connected between the SCL port of the JY-901 module 6 and the 5V port of the LilyPad Arduino, and another 4.7 kΩ pull-up resistor 15 is connected between the SDA port of the JY-901 module 6 and the 5V port of the LilyPad Arduino. The LilyPad Arduino communicates with the JY-901 module 6 over the I2C bus protocol, and the JY-901 module 6 measures the dynamic Euler angles of the palm;
A LilyPad XBee 2 is fixed at the centre of the palm; the 5V port, GND port, rx port 11 and tx port 12 of the LilyPad XBee 2 are connected to the 5V, GND, D5 and D6 ports of the LilyPad Arduino 1, respectively, so that it reads the data of the LilyPad Arduino 1. A 2.4 GHz XBee 1mW Wire Antenna wireless module 3 is mounted on the LilyPad XBee 2; it uses the 802.15.4 protocol stack and communicates bidirectionally with the computer through a serial port, transferring the received LilyPad Arduino 1 data to the computer's serial port, which the computer reads through a serial-port library;
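On the computer side, the serial data stream can be read, for example, with the pySerial library. The sketch below assumes the glove firmware sends one comma-separated sample per line; the line format, port name and baud rate are illustrative assumptions, not part of the invention.

```python
# Sketch only: read glove samples from the XBee serial link with pySerial.
# Assumption: the firmware prints one comma-separated line per sample, e.g.
# "b0,b1,b2,b3,b4,roll,pitch,yaw,p0,p1,p2,p3" -- this format is illustrative.
import serial

PORT = "COM3"   # adjust to the serial port exposed by the XBee adapter
BAUD = 9600     # must match the XBee/LilyPad configuration

with serial.Serial(PORT, BAUD, timeout=1) as link:
    while True:
        line = link.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        fields = line.split(",")
        bend = [int(x) for x in fields[0:5]]      # five bend-sensor readings
        euler = [float(x) for x in fields[5:8]]   # roll, pitch, yaw from the JY-901
        press = [int(x) for x in fields[8:12]]    # four binary FSR402 values
        print(bend, euler, press)
```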
An EB-Vibrator vibration motor 7 is fixed toward the rear of the centre of the back of the hand; the 5V, GND and SIG ports of the vibration motor 7 are connected to the 5V, GND and D9 ports of the LilyPad Arduino, respectively, and the PWM output of the D9 port regulates the vibration intensity of the motor 7.
Fig. 2 shows the component layout on the physical glove. The left figure shows the component layout on the palm side, and the right figure shows the layout on the back of the hand. Note that at position A in the figure four wires pass around the side of the hand; B in the figure is the second knuckle of the index finger, on whose side an FSR402 resistive film pressure sensor is fixed.
The present invention is a method for accurate three-dimensional modeling using natural gestures based on a data glove, which specifically comprises the following steps:
(1) A Kinect device produced by Microsoft is connected to the computer, and the official Kinect API provided by Microsoft is used to obtain the dynamic three-dimensional spatial coordinates of the hand and the three-axis acceleration of the hand motion;
(2) Data collection and labelling: each gesture is first assigned a number. A serial read/write library is used to read from the data glove the finger-bend analog values of the bend sensors, the dynamic Euler angles obtained by the JY-901 module and the digital readings of the FSR402 resistive film pressure sensors, together with the three-dimensional spatial coordinates of the hand and the acceleration of the hand motion obtained by the Kinect device. To avoid reading in too much repeated data, the reading interval is set to 20 seconds. After all data have been collected, a spreadsheet file train.csv is created and the data are written to it. Labelling is done manually: train.csv is opened and the gesture number is appended at the end of each row, i.e. to each data sample. The data collection can be implemented on the Visual Studio platform;
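A minimal sketch of this collection step using Python's csv module; the column order and the read_glove/read_kinect placeholders are assumptions, and for brevity the gesture number is appended as each row is written, whereas the text above describes labelling the rows manually after collection.

```python
# Sketch only: append one labelled sample per 20-second interval to train.csv.
# The feature order (bend values, Euler angles, pressure bits, Kinect
# coordinates and acceleration) is an assumption for illustration.
import csv
import time

def collect_samples(read_glove, read_kinect, gesture_id, n_samples, path="train.csv"):
    """read_glove() and read_kinect() are placeholders for the serial and
    Kinect read routines; each returns a flat list of numbers."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for _ in range(n_samples):
            row = read_glove() + read_kinect()
            row.append(gesture_id)   # the gesture number used as the label
            writer.writerow(row)
            time.sleep(20)           # 20-second sampling interval
```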
(3) A gesture classifier based on a random forest model is trained with the data collected in step 2: the gesture classifier is a tree-structured ensemble composed of n decision trees, and its output is obtained by simple majority voting, i.e. the class that occurs most often among the n decision trees is output. Unlike an ordinary decision tree, each decision tree in the random forest does not use all features of a data point; instead, m features are drawn at random from all features to grow the tree. The gesture classifier uses the Gini index as the measure of split quality, i.e. at every split of every decision tree the splitting rule with the smallest Gini index among all possible splits is chosen. Splitting stops when all data in a leaf node belong to the same gesture class or when a leaf node contains fewer than 2 samples. In this model the decision trees are not pruned;
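The simple majority vote described above can be expressed compactly, independently of any particular tree implementation:

```python
# Sketch only: simple majority voting over the predictions of n decision trees.
from collections import Counter

def majority_vote(tree_predictions):
    """tree_predictions: iterable of class labels, one per decision tree.
    Returns the label that occurs most often."""
    return Counter(tree_predictions).most_common(1)[0][0]

# Example: three trees vote for gesture 2 and one for gesture 5 -> gesture 2 wins.
print(majority_vote([2, 5, 2, 2]))
```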
(4) The random forest classifier of the Scikit-learn machine learning library is used: the two parameters n and m of the classifier are set to 500 and 3, respectively; then the finger-bend analog values of the bend sensors in the data glove, the dynamic Euler angles obtained by the JY-901 module, and the three-dimensional spatial coordinates of the hand and the acceleration of the hand motion obtained by the Kinect device are read in for training, after which the training of the classifier is complete;
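Step 4 maps directly onto the RandomForestClassifier of Scikit-learn; the sketch below uses the parameter values stated above (n = 500 trees, m = 3 features per split) and assumes train.csv holds only numeric columns with the gesture number last.

```python
# Sketch only: training the random forest of step (4) with scikit-learn.
# Assumption: train.csv contains only numeric columns, with the gesture
# number in the last column and the four binary press values (handled
# separately in step (5)) already removed.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

data = np.loadtxt("train.csv", delimiter=",")
X, y = data[:, :-1], data[:, -1]   # features / gesture labels

clf = RandomForestClassifier(
    n_estimators=500,      # n = 500 decision trees
    max_features=3,        # m = 3 features drawn at random at each split
    criterion="gini",      # Gini index as the split-quality measure
    min_samples_split=2,   # nodes with fewer than 2 samples are not split
)
clf.fit(X, y)
```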
(5) Recognition of gesture commands: for a new data point the gesture classifier makes a hierarchical judgement. It first checks whether any of the 4 binary press values carries data; if so, the user's gesture is determined from the position at which the press occurred. If none of the 4 binary press values carries data, the finger-bend analog values of the bend sensors on the glove, the dynamic Euler angles obtained by the JY-901 module, and the three-dimensional spatial coordinates of the hand and the acceleration of the hand motion obtained by the Kinect device are read into the trained random forest model. After the data are read in, each tree takes the 3 features it used during training, classifies the data step by step down to a leaf node according to the splitting conditions of its nodes, and outputs the corresponding class. Once all 500 decision trees have produced their outputs, the class with the most votes is counted and the gesture corresponding to that class is taken as the user's gesture;
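The hierarchical judgement of step 5 can be sketched as follows; the mapping from press position to gesture number is a placeholder, and only the two-level structure follows the text.

```python
# Sketch only: the two-level gesture recognition of step (5).
# PRESS_GESTURES maps the index of a pressed FSR402 sensor to a gesture
# number; the mapping shown here is a placeholder, not the patented one.
PRESS_GESTURES = {0: 10, 1: 11, 2: 12, 3: 13}

def recognize(press_bits, continuous_features, clf):
    """press_bits: four 0/1 readings from the FSR402 sensors.
    continuous_features: bend values, Euler angles, Kinect coordinates and
    acceleration, in the same order used when training clf."""
    for i, bit in enumerate(press_bits):
        if bit:                                   # a press occurred here
            return PRESS_GESTURES[i]
    # No press detected: let the 500-tree forest vote on the gesture class.
    return int(clf.predict([continuous_features])[0])
```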
(6) The gesture is translated into a command of the three-dimensional modeling software: after the gesture classifier has identified the gesture, a script sends the corresponding command to the three-dimensional modeling software, which builds the corresponding model according to the command received;
(7) The finger-bend analog values of the bend sensors in the data glove, the dynamic Euler angles obtained by the JY-901 module, and the three-dimensional spatial coordinates of the hand and the acceleration of the hand motion obtained by the Kinect device are passed as parameters to the three-dimensional modeling software to set parameters such as the translation, scaling and rotation of the model obtained in step 6, thereby realizing accurate three-dimensional modeling based on natural gestures.

Claims (1)

1. A method for accurate three-dimensional modeling using natural gestures based on a data glove, characterized in that a LilyPad Arduino is fixed at the centre of the back of the hand of the data glove, a strip-shaped bend sensor of fixed length is fixed along each of the five fingers, one pin of each bend sensor is connected to the GND port of the LilyPad Arduino and the other pin to an analog port of the LilyPad Arduino, and a 10 kΩ voltage-divider resistor is connected between the 5V port of the LilyPad Arduino and the analog port;
FSR402 resistive film pressure sensors are fixed at the fingertip of the ring finger, the fingertip of the middle finger, the fingertip of the index finger, and on the thumb-facing side of the second joint of the index finger, respectively; one pin of each FSR402 resistive film pressure sensor is connected to the GND port of the LilyPad Arduino and the other pin to a digital port of the LilyPad Arduino; a 20 kΩ voltage-divider resistor is connected between the 5V port and the digital port of the LilyPad Arduino, so that the reading of the digital port is a binary judgement of the voltage across the FSR402 sensor: when the sensor is not pressed the digital reading is 0, and when the sensor is pressed the digital reading is 1;
A JY-901 module, which integrates an accelerometer, a gyroscope and a geomagnetic field sensor, is fixed toward the rear of the back of the hand; the 5V, GND, SCL and SDA ports of the JY-901 module are connected to the 5V, GND, SCL and SDA ports of the LilyPad Arduino, respectively; a 4.7 kΩ pull-up resistor is connected between the SCL port of the JY-901 module and the 5V port of the LilyPad Arduino, and another 4.7 kΩ pull-up resistor is connected between the SDA port of the JY-901 module and the 5V port of the LilyPad Arduino; the LilyPad Arduino communicates with the JY-901 module over the I2C bus protocol, and the JY-901 module measures the dynamic Euler angles of the palm;
A LilyPad XBee is fixed at the centre of the palm; its 5V, GND, rx and tx ports are connected to the 5V, GND, D5 and D6 ports of the LilyPad Arduino, respectively, so that it reads the data of the LilyPad Arduino; a 2.4 GHz XBee 1mW Wire Antenna wireless module is mounted on the LilyPad XBee, uses the 802.15.4 protocol stack and communicates bidirectionally with the computer through a serial port, transferring the received LilyPad Arduino data to the computer's serial port, which the computer reads through a serial-port library;
An EB-Vibrator vibration motor is fixed toward the rear of the centre of the back of the hand; its 5V, GND and SIG ports are connected to the 5V, GND and D9 ports of the LilyPad Arduino, respectively, and the PWM output of the D9 port regulates the vibration intensity of the motor;
The method specifically comprises the following steps:
(1) A Kinect device produced by Microsoft is connected to the computer, and the official Kinect API provided by Microsoft is used to obtain the dynamic three-dimensional spatial coordinates of the hand and the three-axis acceleration of the hand motion;
(2) Data collection and labelling: each gesture is first assigned a number. A serial read/write library is used to read from the data glove the finger-bend analog values of the bend sensors, the dynamic Euler angles obtained by the JY-901 module and the digital readings of the FSR402 resistive film pressure sensors, together with the three-dimensional spatial coordinates of the hand and the acceleration of the hand motion obtained by the Kinect device. To avoid reading in too much repeated data, the reading interval is set to 20 seconds. After all data have been collected, a spreadsheet file train.csv is created and the data are written to it. Labelling is done manually: train.csv is opened and the gesture number is appended at the end of each row, i.e. to each data sample. The data collection can be implemented on the Visual Studio platform;
(3) A gesture classifier based on a random forest model is trained with the data collected in step 2: the gesture classifier is a tree-structured ensemble composed of n decision trees, and its output is obtained by simple majority voting, i.e. the class that occurs most often among the n decision trees is output. Unlike an ordinary decision tree, each decision tree in the random forest does not use all features of a data point; instead, m features are drawn at random from all features to grow the tree. The gesture classifier uses the Gini index as the measure of split quality, i.e. at every split of every decision tree the splitting rule with the smallest Gini index among all possible splits is chosen. Splitting stops when all data in a leaf node belong to the same gesture class or when a leaf node contains fewer than 2 samples. In this model the decision trees are not pruned;
(4) The random forest classifier of the Scikit-learn machine learning library is used: the two parameters n and m of the classifier are set to 500 and 3, respectively; then the finger-bend analog values of the bend sensors in the data glove, the dynamic Euler angles obtained by the JY-901 module, and the three-dimensional spatial coordinates of the hand and the acceleration of the hand motion obtained by the Kinect device are read in for training, after which the training of the classifier is complete;
(5) Recognition of gesture commands: for a new data point the gesture classifier makes a hierarchical judgement. It first checks whether any of the 4 binary press values carries data; if so, the user's gesture is determined from the position at which the press occurred. If none of the 4 binary press values carries data, the finger-bend analog values of the bend sensors on the glove, the dynamic Euler angles obtained by the JY-901 module, and the three-dimensional spatial coordinates of the hand and the acceleration of the hand motion obtained by the Kinect device are read into the trained random forest model. After the data are read in, each tree takes the 3 features it used during training, classifies the data step by step down to a leaf node according to the splitting conditions of its nodes, and outputs the corresponding class. Once all 500 decision trees have produced their outputs, the class with the most votes is counted and the gesture corresponding to that class is taken as the user's gesture;
(6) The gesture is translated into a command of the three-dimensional modeling software: after the gesture classifier has identified the gesture, a script sends the corresponding command to the three-dimensional modeling software, which builds the corresponding model according to the command received;
(7) The finger-bend analog values of the bend sensors in the data glove, the dynamic Euler angles obtained by the JY-901 module, and the three-dimensional spatial coordinates of the hand and the acceleration of the hand motion obtained by the Kinect device are passed as parameters to the three-dimensional modeling software to set parameters such as the translation, scaling and rotation of the model obtained in step 6, thereby realizing accurate three-dimensional modeling based on natural gestures.
CN201510114915.XA 2015-03-16 2015-03-16 Method for accurate three-dimensional modeling using natural gestures based on a data glove Active CN104778746B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510114915.XA CN104778746B (en) 2015-03-16 2015-03-16 Method for accurate three-dimensional modeling using natural gestures based on a data glove

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510114915.XA CN104778746B (en) 2015-03-16 2015-03-16 Method for accurate three-dimensional modeling using natural gestures based on a data glove

Publications (2)

Publication Number Publication Date
CN104778746A true CN104778746A (en) 2015-07-15
CN104778746B CN104778746B (en) 2017-06-16

Family

ID=53620189

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510114915.XA Active CN104778746B (en) 2015-03-16 2015-03-16 Method for accurate three-dimensional modeling using natural gestures based on a data glove

Country Status (1)

Country Link
CN (1) CN104778746B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040120581A1 (en) * 2002-08-27 2004-06-24 Ozer I. Burak Method and apparatus for automated video activity analysis
CN101694692A (en) * 2009-10-22 2010-04-14 浙江大学 Gesture identification method based on acceleration transducer
US20120057779A1 (en) * 2010-09-02 2012-03-08 Edge 3 Technologies, Inc. Method and Apparatus for Confusion Learning
CN103914149A (en) * 2014-04-01 2014-07-09 复旦大学 Gesture interaction method and gesture interaction system for interactive television
CN103971102A (en) * 2014-05-21 2014-08-06 南京大学 Static gesture recognition method based on finger contour and decision-making trees

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
TIBOR LUKIC et al.: "Regularized image denoising based on spectral gradient optimization", Inverse Problems *
赵显: "Research on a gesture detection and recognition system based on random forest", China Master's Theses Full-text Database, Information Science and Technology *
陈喆: "Research on gesture tracking, detection and recognition technology and system implementation", China Master's Theses Full-text Database, Information Science and Technology *
高扬: "Gesture image classifier based on Kinect", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105404390A (en) * 2015-10-29 2016-03-16 华侨大学 Modeling and gesture action identification method of wireless data glove
CN105242788A (en) * 2015-10-29 2016-01-13 华侨大学 Bending sensor-based wireless data glove circuit layout and sensor configuration method
CN105242788B (en) * 2015-10-29 2018-10-16 华侨大学 A kind of wireless data gloves wiring and Way of Sensor Deployment based on bending sensor
CN105404390B (en) * 2015-10-29 2018-07-20 华侨大学 A kind of modeling of wireless data gloves and gesture motion recognition methods
CN105666497A (en) * 2016-04-21 2016-06-15 奇弩(北京)科技有限公司 Universal mechanical arm with gesture learning function
CN105769343B (en) * 2016-04-26 2018-05-18 中国科学院自动化研究所 A kind of blood vessel intervention operation operation harvester and method
CN105769343A (en) * 2016-04-26 2016-07-20 中国科学院自动化研究所 Vascular intervention operation collection device and vascular intervention operation collection method
CN106023308A (en) * 2016-05-31 2016-10-12 东南大学 Somatosensory interaction rapid three-dimensional modeling auxiliary system and method thereof
CN106354415A (en) * 2016-10-08 2017-01-25 努比亚技术有限公司 Terminal and method for recognizing user gesture thereof
CN106354415B (en) * 2016-10-08 2020-05-26 瑞安市辉煌网络科技有限公司 Terminal and method for recognizing user gesture
CN108010577B (en) * 2016-10-28 2022-08-16 西门子保健有限责任公司 Operation assistant
CN108010577A (en) * 2016-10-28 2018-05-08 西门子保健有限责任公司 Operate assistant
CN106681500A (en) * 2016-12-14 2017-05-17 天津雅达电子商务有限公司 Clicking identification system and method in man-machine interaction device in electronic information field
CN106681500B (en) * 2016-12-14 2019-12-24 北京康吉森技术有限公司 Click recognition system and method in man-machine interaction equipment in electronic information field
CN107358210A (en) * 2017-07-17 2017-11-17 广州中医药大学 Human motion recognition method and device
CN107358210B (en) * 2017-07-17 2020-05-15 广州中医药大学 Human body action recognition method and device
CN110362195A (en) * 2019-06-10 2019-10-22 东南大学 Gesture identification and interactive system based on bistable state coding and Flexiable angular transducer
CN110716644A (en) * 2019-10-11 2020-01-21 安徽建筑大学 Tactile feedback glove and VR (virtual reality) equipment assembly with same
CN110991319A (en) * 2019-11-29 2020-04-10 广州市百果园信息技术有限公司 Hand key point detection method, gesture recognition method and related device
CN113496168A (en) * 2020-04-02 2021-10-12 百度在线网络技术(北京)有限公司 Sign language data acquisition method, sign language data acquisition equipment and storage medium
CN112233546A (en) * 2020-10-27 2021-01-15 广西师范大学 Concept presentation device for traditional aesthetics
CN113223344A (en) * 2021-05-25 2021-08-06 湖南汽车工程职业学院 Big data-based professional teaching display system for art design
CN113238661A (en) * 2021-07-09 2021-08-10 呜啦啦(广州)科技有限公司 Data processing method and system for data glove, electronic equipment and medium

Also Published As

Publication number Publication date
CN104778746B (en) 2017-06-16

Similar Documents

Publication Publication Date Title
CN104778746A (en) Method for performing accurate three-dimensional modeling based on data glove by using natural gestures
CN106445130B (en) A kind of motion capture gloves and its calibration method for gesture identification
CN102824176B (en) Upper limb joint movement degree measuring method based on Kinect sensor
CN103226398B (en) Based on the data glove of micro-inertia sensor network technology
Bukhari et al. American sign language translation through sensory glove; signspeak
CN110476168A (en) Method and system for hand tracking
CN108268129A (en) The method and apparatus and motion capture gloves calibrated to multiple sensors on motion capture gloves
CN108182433A (en) A kind of meter reading recognition methods and system
Ahmed et al. Based on wearable sensory device in 3D-printed humanoid: A new real-time sign language recognition system
CN107920783A (en) The system and method for monitoring and the movement and alignment mode of user's body activity association
CN105446485B (en) System and method is caught based on data glove and the human hand movement function of position tracking instrument
Banos et al. Kinect= imu? learning mimo signal mappings to automatically translate activity recognition systems across sensor modalities
US10065111B1 (en) Mapping user interactions with a controller to a hand position
Hilman et al. Virtual hand: VR hand controller using IMU and flex sensor
CN206378818U (en) A kind of Hand gesture detection device based on wireless self-networking pattern
CN101762252A (en) Virtual three-coordinate measuring instrument and use method thereof
CN109147057A (en) A kind of virtual hand collision checking method towards wearable haptic apparatus
CN206270980U (en) A kind of motion capture gloves for gesture identification
CN109099827B (en) Method for detecting posture of pen body through capacitance and electromagnetic positioning double sensors
Bal et al. Dynamic hand gesture pattern recognition using probabilistic neural network
CN112926550A (en) Human-computer interaction method and device based on three-dimensional image human body posture matching
Vicente et al. Calibration of kinematic body sensor networks: Kinect-based gauging of data gloves “in the wild”
KR20230147144A (en) Methods for improving markerless motion analysis
CN108072371A (en) Localization method, positioner and electronic equipment
CN107704087B (en) Data glove calibration method based on joint correlation analysis

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant