CN104700403B - Kinect gesture-based virtual demonstration method for controlling a hydraulic support - Google Patents
Kinect gesture-based virtual demonstration method for controlling a hydraulic support
- Publication number
- CN104700403B (application CN201510071854.3A)
- Authority
- CN
- China
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
The invention discloses a Kinect gesture-based virtual demonstration system and method for controlling a hydraulic support. The system comprises a Kinect module, a depth-of-field data acquisition module, a joint-point information processing module, a gesture feature matching module, a control module and a display device module; the Kinect module is connected in sequence to the depth-of-field data acquisition module, the joint-point information processing module, the gesture feature matching module, the control module and the display device module. The operator only needs to make specific gestures to observe, through the display device module, the virtual hydraulic support performing the corresponding actions, which helps people better understand the working process of a hydraulic support.
Description
Technical field
The present invention relates to a Kinect gesture-based virtual demonstration system and method for controlling a hydraulic support, and belongs to the field of automatic control technology.
Background technology
Human-computer interaction technology allows a person to communicate with a computer through its input and output devices. A common approach tracks the position and pose of the operator's hand in real time with devices such as electromagnetic trackers, inertial sensors and data gloves. All of these are contact sensors: the operator must wear extra equipment, which makes the interaction unnatural. Microsoft's Kinect motion-sensing device, in contrast, is a contactless sensor; the computer is controlled by the operator's hand movements without any equipment worn on the body, so the interaction takes place in a natural manner.
A hydraulic support is a structure used to control the strata pressure at a coal mining face. As essential equipment of fully mechanized coal mining, it provides a safe working space for the shearer, the scraper conveyor and the face personnel. It is, however, difficult to form a vivid understanding of the working process of a hydraulic support by study alone. To improve the interactive experience of the working process, an interaction control system between a person and a virtual three-dimensional hydraulic support environment needs to be designed, so that people can better understand how a hydraulic support works.
Summary of the invention
To address the above problems of the prior art, the present invention provides a Kinect gesture-based virtual demonstration system and method for controlling a hydraulic support. The operator only needs to make specific gestures to observe, through the display device module, the virtual hydraulic support performing the corresponding actions, which helps people better understand the working process of a hydraulic support.
To achieve this goal, the following technical solution is adopted. The Kinect gesture-based virtual demonstration system for controlling a hydraulic support comprises a Kinect module, a depth-of-field data acquisition module, a joint-point information processing module, a gesture feature matching module, a control module and a display device module; the Kinect module is connected in sequence to the depth-of-field data acquisition module, the joint-point information processing module, the gesture feature matching module, the control module and the display device module.
The Kinect gesture-based virtual demonstration method for controlling a hydraulic support comprises the following concrete steps:
(1) Preset eight different standard gestures and store them in the control module;
(2) The control module maps the eight standard gestures respectively to the control signals of eight hydraulic support actions: lowering the column, advancing the support, setting the column, pushing the conveyor, lowering the face guard, raising the face guard, retracting the front beam and raising the front beam;
(3) Initialize the Kinect module and set the camera elevation angle; the operator stands 1.2 m to 3.5 m from the camera and makes one of the eight preset standard gestures;
(4) The Kinect module detects the operator's motion and obtains human skeleton information data frames from the color stream, the depth stream and the skeleton stream;
(5) The depth-of-field data acquisition module analyzes and processes the skeleton information data frames transmitted by the Kinect module, obtains the image depth data and extracts the human skeleton information; by establishing a 3D coordinate system for the hand and arm joints, it obtains the coordinates of the hand and arm joint nodes so as to identify the different parts of the human body;
(6) From the obtained 3D coordinates of the hand and arm joints, the joint-point information processing module calculates the rotation angles of the shoulder joint about its three degrees of freedom and of the elbow joint about its two degrees of freedom; from the shoulder and elbow rotation angles it identifies the rotation information of the arm skeleton nodes and processes the data by capturing the changes of the different skeletal joint angles;
(7) The gesture feature matching module matches each of the above joint angles against the corresponding joint angles of the eight standard gestures stored in the control module; if the match fails, return to step (3); if the match succeeds, the gesture feature matching module passes the data to the control module;
(8) The control module receives the successfully matched data from the gesture feature matching module and, after analysis and processing, sends the action control signal of the hydraulic support corresponding to the gesture, thereby driving the three-dimensional hydraulic support to act; the action is shown through the display device module.
Compared with the prior art, the present invention collects gesture information through the Kinect module, tracks the position of every arm joint from the three-dimensional information of the extractable skeleton points and, by recognizing specific gestures, makes the virtual hydraulic support perform the corresponding actions, helping people better understand the working process of a hydraulic support.
Accompanying drawing explanation
Fig. 1 is the electrical schematic block diagram of the present invention;
Fig. 2 is a schematic diagram of the human arm motion when a gesture is made;
Fig. 3 shows the D-H coordinate system of the arm;
Fig. 4 is the flow chart of the present invention.
Detailed description of the invention
The invention is further described below with reference to the accompanying drawings.
As shown in Fig. 1 and Fig. 2, the Kinect gesture-based virtual demonstration system for controlling a hydraulic support comprises a Kinect module, a depth-of-field data acquisition module, a joint-point information processing module, a gesture feature matching module, a control module and a display device module; the Kinect module is connected in sequence to the depth-of-field data acquisition module, the joint-point information processing module, the gesture feature matching module, the control module and the display device module.
The Kinect gesture-based virtual demonstration method for controlling a hydraulic support comprises the following concrete steps:
(1) Preset eight different standard gestures and store them in the control module;
(2) The control module maps the eight standard gestures respectively to the control signals of eight hydraulic support actions: lowering the column, advancing the support, setting the column, pushing the conveyor, lowering the face guard, raising the face guard, retracting the front beam and raising the front beam;
(3) Initialize the Kinect module and set the camera elevation angle; the operator stands 1.2 m to 3.5 m from the camera and makes one of the eight preset standard gestures;
(4) The Kinect module detects the operator's motion and obtains human skeleton information data frames from the color stream, the depth stream and the skeleton stream;
(5) The depth-of-field data acquisition module analyzes and processes the skeleton information data frames transmitted by the Kinect module, obtains the image depth data and extracts the human skeleton information; by establishing a 3D coordinate system for the hand and arm joints, it obtains the coordinates of the hand and arm joint nodes so as to identify the different parts of the human body (a sketch of this joint extraction is given below);
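As an illustration of step (5), the following Python sketch pulls the right-arm joint coordinates out of one skeleton information data frame. The `frame` layout, the `joints` mapping and the joint names are hypothetical stand-ins; the patent does not name a particular Kinect SDK binding.

```python
# Hypothetical sketch of step (5): extracting right-arm joint coordinates
# from one skeleton information data frame. The frame layout and the joint
# names are stand-ins, not a specific Kinect SDK API.
ARM_JOINTS = ("shoulder_right", "elbow_right", "wrist_right", "hand_right")

def extract_arm_coordinates(frame):
    """Map each tracked arm joint to its (x, y, z) position in meters."""
    coords = {}
    for name in ARM_JOINTS:
        joint = frame.joints[name]
        if joint.tracking_state == "tracked":  # ignore inferred or lost joints
            coords[name] = (joint.position.x, joint.position.y, joint.position.z)
    return coords
```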
(6) From the obtained 3D coordinates of the hand and arm joints, the joint-point information processing module calculates the rotation angles of the shoulder joint about its three degrees of freedom and of the elbow joint about its two degrees of freedom; from the shoulder and elbow rotation angles it identifies the rotation information of the arm skeleton nodes and processes the data by capturing the changes of the different skeletal joint angles. The concrete process is as follows:
As shown in Fig. 3, the motion of the right shoulder joint is decomposed into flexion/extension, abduction/adduction and rotation, corresponding to the angles $\alpha_1$, $\beta_1$ and $\gamma_1$ respectively; the elbow motion is flexion/extension, with corresponding angle $\delta_1$. A reference frame $o_0\text{-}x_0y_0z_0$ is established at the midpoint between the two shoulder joints. Let the 3D coordinate of the right shoulder joint skeleton node be $(x_1, y_1, z_1)$, that of the right elbow joint skeleton node $(x_2, y_2, z_2)$, and that of the right wrist joint skeleton node $(x_3, y_3, z_3)$;
Right-arm shoulder breadth (distance from the frame origin to the right shoulder joint): $L_0 = \sqrt{x_1^2 + y_1^2 + z_1^2}$;
Right upper-arm length: $L_1 = \sqrt{(x_2-x_1)^2 + (y_2-y_1)^2 + (z_2-z_1)^2}$;
Right forearm length: $L_2 = \sqrt{(x_3-x_2)^2 + (y_3-y_2)^2 + (z_3-z_2)^2}$.
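A minimal Python sketch of the three length computations above, assuming numpy and joint positions given as (x, y, z) triples in the mid-shoulder reference frame:

```python
import numpy as np

def segment_lengths(shoulder, elbow, wrist):
    """L0, L1, L2 from the joint node coordinates, per the formulas above.

    The reference frame origin is the midpoint between the two shoulder
    joints, so L0 is simply the norm of the shoulder node coordinate.
    """
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (shoulder, elbow, wrist))
    L0 = np.linalg.norm(p1)       # shoulder offset from the mid-shoulder origin
    L1 = np.linalg.norm(p2 - p1)  # upper-arm length
    L2 = np.linalg.norm(p3 - p2)  # forearm length
    return L0, L1, L2
```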
For the shoulder flexion/extension angle, $-40° < \alpha_1 < 90°$;
for the shoulder abduction/adduction angle, $-90° < \beta_1 < 20°$;
for the elbow flexion/extension angle, $0° < \delta_1 < 130°$. These three angles are obtained directly from the joint node coordinates.
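The closed-form expressions for these three angles appear only as images in the original; the sketch below recovers them from vector geometry instead, under assumed axis conventions (x to the body's right, y up, z toward the camera; all angles zero with the arm hanging straight down):

```python
import numpy as np

def arm_angles(shoulder, elbow, wrist):
    """Estimate alpha1 (shoulder flexion/extension), beta1 (shoulder
    abduction/adduction) and delta1 (elbow flexion), all in degrees.

    A stand-in for the patent's closed forms, which are not reproduced
    in the text; axis conventions are assumptions.
    """
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (shoulder, elbow, wrist))
    upper, fore = p2 - p1, p3 - p2
    alpha1 = np.degrees(np.arctan2(upper[2], -upper[1]))  # sagittal-plane swing
    beta1 = np.degrees(np.arctan2(upper[0], -upper[1]))   # frontal-plane swing

    # Elbow flexion is the angle between the upper-arm and forearm vectors.
    c = np.dot(upper, fore) / (np.linalg.norm(upper) * np.linalg.norm(fore))
    delta1 = np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))
    return alpha1, beta1, delta1
```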
The shoulder rotation angle $\gamma_1$, however, cannot be obtained directly from the known coordinates. A D-H coordinate system of the human joints is therefore established; the coordinate expression of the wrist node is derived by coordinate transformation, and $\gamma_1$ is then solved inversely from the known wrist node coordinates.
According to the D-H parameters in Table 1, the affine transformation matrix corresponding to each joint is written out; for link $i$ the standard D-H transform is

$$A_i = \begin{pmatrix} \cos\theta_i & -\sin\theta_i\cos\alpha_i & \sin\theta_i\sin\alpha_i & a_i\cos\theta_i \\ \sin\theta_i & \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i \\ 0 & \sin\alpha_i & \cos\alpha_i & d_i \\ 0 & 0 & 0 & 1 \end{pmatrix}$$

The total transformation matrix is $T = A_1 A_2 A_3 A_4 A_5$, whose translation column $(p_x, p_y, p_z)$ gives the wrist node coordinates:

$$x_3 = p_x = L_1\cos\theta_2\sin\theta_3 + L_2\cos\theta_5(\sin\theta_2\sin\theta_4 + \cos\theta_2\cos\theta_3\cos\theta_4) + L_2\cos\theta_2\sin\theta_3\sin\theta_5$$
$$y_3 = p_y = L_1\sin\theta_2\sin\theta_3 - L_2\cos\theta_5(\cos\theta_2\sin\theta_4 + \cos\theta_3\cos\theta_4\sin\theta_2) + L_2\sin\theta_2\sin\theta_3\sin\theta_5$$
$$z_3 = p_z = L_0 - L_1\cos\theta_3 - L_2\cos\theta_3\sin\theta_5 + L_2\cos\theta_4\cos\theta_5\sin\theta_3$$

where the joint variables $\theta_2$, $\theta_3$, $\theta_4$ and $\theta_5$ of Table 1 correspond to $\alpha_1$, $\beta_1$, $\gamma_1$ and $\delta_1$ respectively. Substituting the known angles and lengths into the above formulas yields the shoulder rotation angle $\gamma_1$ ($-90° < \gamma_1 < 45°$);
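The inverse closed-form solution is likewise not reproduced in the text, so the following sketch recovers $\gamma_1$ numerically: it composes the D-H link transforms of Table 1 and scans the admissible range for the value whose forward chain best reproduces the measured wrist position. The function names and the scan resolution are illustrative choices:

```python
import numpy as np

def dh_matrix(alpha, a, d, theta):
    """Standard Denavit-Hartenberg link transform A_i (angles in radians)."""
    ca, sa, ct, st = np.cos(alpha), np.sin(alpha), np.cos(theta), np.sin(theta)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def solve_gamma1(L0, L1, L2, alpha1, beta1, delta1, wrist, steps=1351):
    """Scan gamma1 over (-90, 45) degrees for the value whose forward D-H
    chain (per Table 1) best reproduces the measured wrist node position."""
    r = np.radians
    wrist = np.asarray(wrist, dtype=float)
    best_g, best_err = None, np.inf
    for g in np.linspace(-90.0, 45.0, steps):
        T = (dh_matrix(0.0,        0.0, L0,  0.0)       @  # row 1 of Table 1
             dh_matrix(np.pi / 2,  0.0, 0.0, r(alpha1)) @  # theta2 = alpha1
             dh_matrix(np.pi / 2,  0.0, 0.0, r(beta1))  @  # theta3 = beta1
             dh_matrix(np.pi / 2,  0.0, L1,  r(g))      @  # theta4 = gamma1
             dh_matrix(0.0,        L2,  0.0, r(delta1)))   # theta5 = delta1
        err = np.linalg.norm(T[:3, 3] - wrist)
        if err < best_err:
            best_g, best_err = g, err
    return best_g
```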
(7) The gesture feature matching module matches each of the above joint angles against the corresponding joint angles of the eight standard gestures stored in the control module; if the match fails, return to step (3); if the match succeeds and is held for 5 seconds, the gesture feature matching module passes the data to the control module;
(8) The control module receives the successfully matched data from the gesture feature matching module and, after analysis and processing, sends the action control signal of the hydraulic support corresponding to the gesture, thereby driving the three-dimensional hydraulic support to act; the action is shown through the display device module. Table 2 lists the hydraulic support action corresponding to each gesture feature, and a matching sketch follows the table.
Table 1: D-H parameters of the arm (α in degrees; L0, L1, L2 as defined above).

i | α | a | d | θ
---|---|---|---|---
1 | 0 | 0 | L0 | 0
2 | 90 | 0 | 0 | θ2
3 | 90 | 0 | 0 | θ3
4 | 90 | 0 | L1 | θ4
5 | 0 | L2 | 0 | θ5
Table 2: gesture feature angles (in degrees) and the corresponding hydraulic support actions.

No. | α1 | β1 | γ1 | δ1 | Action
---|---|---|---|---|---
1 | 70 | -70 | 0 | 90 | Lower column
2 | 30 | 0 | -90 | 90 | Advance support
3 | 0 | -90 | 0 | 90 | Set column
4 | 0 | -90 | -90 | 90 | Push conveyor
5 | 0 | -90 | 45 | 0 | Lower face guard
6 | 90 | 0 | 45 | 90 | Raise face guard
7 | 90 | 0 | 45 | 0 | Retract front beam
8 | 90 | -90 | 0 | 0 | Raise front beam
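Steps (7) and (8) can be sketched as a tolerance match against Table 2 followed by the 5-second hold check. The tolerance value, the English action labels and the angle-stream interface are assumptions, not part of the patent:

```python
import time

# Table 2 as a lookup: (alpha1, beta1, gamma1, delta1) in degrees -> action.
STANDARD_GESTURES = {
    (70, -70,   0, 90): "lower column",
    (30,   0, -90, 90): "advance support",
    ( 0, -90,   0, 90): "set column",
    ( 0, -90, -90, 90): "push conveyor",
    ( 0, -90,  45,  0): "lower face guard",
    (90,   0,  45, 90): "raise face guard",
    (90,   0,  45,  0): "retract front beam",
    (90, -90,   0,  0): "raise front beam",
}
TOLERANCE_DEG = 15.0  # assumed per-angle matching tolerance
HOLD_SECONDS = 5.0    # the match must persist before a signal is sent

def match_gesture(angles):
    """Return the action whose four standard angles all lie within tolerance."""
    for ref, action in STANDARD_GESTURES.items():
        if all(abs(a - r) <= TOLERANCE_DEG for a, r in zip(angles, ref)):
            return action
    return None

def recognize(angle_stream):
    """Yield a control action once the same gesture has been held 5 seconds."""
    current, since = None, 0.0
    for angles in angle_stream:  # angles = (alpha1, beta1, gamma1, delta1)
        action = match_gesture(angles)
        if action != current:
            current, since = action, time.time()
        elif action is not None and time.time() - since >= HOLD_SECONDS:
            yield action         # hand the signal off to the control module
            current, since = None, 0.0
```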
Claims (1)
1. A virtual demonstration method for controlling a hydraulic support by Kinect-based gestures, characterized by the following concrete steps:
(1) preset eight different standard gestures and store them in the control module;
(2) the control module maps the eight standard gestures respectively to the control signals of eight hydraulic support actions: lowering the column, advancing the support, setting the column, pushing the conveyor, lowering the face guard, raising the face guard, retracting the front beam and raising the front beam;
(3) initialize the Kinect module and set the camera elevation angle; the operator stands 1.2 m to 3.5 m from the camera and makes one of the eight preset standard gestures;
(4) the Kinect module detects the operator's motion and obtains human skeleton information data frames from the color stream, the depth stream and the skeleton stream;
(5) the depth-of-field data acquisition module analyzes and processes the skeleton information data frames transmitted by the Kinect module, obtains the image depth data and extracts the human skeleton information; by establishing a 3D coordinate system for the hand and arm joints, it obtains the coordinates of the hand and arm joint nodes so as to identify the different parts of the human body;
(6) from the obtained 3D coordinates of the hand and arm joints, the joint-point information processing module calculates the rotation angles of the shoulder joint about its three degrees of freedom and of the elbow joint about its two degrees of freedom; from the shoulder and elbow rotation angles it identifies the rotation information of the arm skeleton nodes and processes the data by capturing the changes of the different skeletal joint angles;
(7) the gesture feature matching module matches each of the above joint angles against the corresponding joint angles of the eight standard gestures stored in the control module; if the match fails, return to step (3); if the match succeeds, the gesture feature matching module passes the data to the control module;
(8) the control module receives the successfully matched data from the gesture feature matching module and, after analysis and processing, sends the action control signal of the hydraulic support corresponding to the gesture, thereby driving the three-dimensional hydraulic support to act; the action is shown through the display device module.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510071854.3A CN104700403B (en) | 2015-02-11 | 2015-02-11 | Kinect gesture-based virtual demonstration method for controlling a hydraulic support |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104700403A (en) | 2015-06-10 |
CN104700403B (en) | 2016-11-09 |
Family
ID=53347485
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510071854.3A Active CN104700403B (en) | 2015-02-11 | 2015-02-11 | Kinect gesture-based virtual demonstration method for controlling a hydraulic support |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104700403B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105500370B (en) * | 2015-12-21 | 2018-11-02 | 华中科技大学 | A kind of robot off-line teaching programing system and method based on body-sensing technology |
US20180345491A1 (en) * | 2016-01-29 | 2018-12-06 | Mitsubishi Electric Corporation | Robot teaching device, and method for generating robot control program |
CN108602190B (en) * | 2016-02-05 | 2022-02-18 | Abb瑞士股份有限公司 | Controlling an industrial robot using interactive commands |
JP6585665B2 (en) * | 2017-06-29 | 2019-10-02 | ファナック株式会社 | Virtual object display system |
JP6781201B2 (en) * | 2018-06-05 | 2020-11-04 | ファナック株式会社 | Virtual object display system |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103170973A (en) * | 2013-03-28 | 2013-06-26 | 上海理工大学 | Man-machine cooperation device and method based on Kinect video camera |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102727362B (en) * | 2012-07-20 | 2014-09-24 | 上海海事大学 | NUI (Natural User Interface)-based peripheral arm motion tracking rehabilitation training system and training method |
CN104108097A (en) * | 2014-06-25 | 2014-10-22 | 陕西高华知本化工科技有限公司 | Feeding and discharging mechanical arm system based on gesture control |
2015-02-11: application CN201510071854.3A (CN) filed; granted as CN104700403B (en), status Active.
Also Published As
Publication number | Publication date |
---|---|
CN104700403A (en) | 2015-06-10 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |