CN104700403B - Kinect-based gesture-controlled virtual demonstration method for a hydraulic support - Google Patents

Kinect-based gesture-controlled virtual demonstration method for a hydraulic support

Info

Publication number
CN104700403B
CN104700403B (application CN201510071854.3A)
Authority
CN
China
Prior art keywords
module
joint
kinect
hydraulic support
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510071854.3A
Other languages
Chinese (zh)
Other versions
CN104700403A (en)
Inventor
刘新华
刘晶晶
王忠宾
谭超
彭俊泉
任衍坤
张秋香
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Mining and Technology CUMT
Original Assignee
China University of Mining and Technology CUMT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Mining and Technology CUMT filed Critical China University of Mining and Technology CUMT
Priority to CN201510071854.3A priority Critical patent/CN104700403B/en
Publication of CN104700403A publication Critical patent/CN104700403A/en
Application granted granted Critical
Publication of CN104700403B publication Critical patent/CN104700403B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a Kinect-based gesture-controlled virtual demonstration system and method for a hydraulic support, comprising a Kinect module, a depth-of-field data acquisition module, a joint-point information processing module, a gesture feature matching module, a control module and a display device module; the Kinect module is connected in sequence with the depth-of-field data acquisition module, the joint-point information processing module, the gesture feature matching module, the control module and the display device module. An operator only needs to make specific gestures to observe, through the display device module, the virtual hydraulic support performing the corresponding actions, which helps people better understand the working process of a hydraulic support.

Description

Kinect-based gesture-controlled virtual demonstration method for a hydraulic support
Technical field
The present invention relates to a Kinect-based gesture-controlled virtual demonstration system and method for a hydraulic support, and belongs to the field of automatic control technology.
Background art
Human-computer interaction technology enables people to communicate with a computer through its input and output devices. A common interaction approach tracks the position and pose of the operator's hands in real time using devices such as electromagnetic trackers, inertial sensors and data gloves. These are contact sensors: extra equipment must be worn, which makes the interaction less natural. Microsoft's Kinect, by contrast, is a contactless motion-sensing device that lets the operator control a computer through hand movements alone, without wearing any equipment, so that interaction takes place in a natural manner.
A hydraulic support is a structure used to control the mine pressure at a coal mining face. As essential equipment for fully mechanized coal mining, it provides a safe working space for the shearer, the scraper conveyor and the staff on a fully mechanized face. However, it is difficult for people to form an intuitive understanding of the working process of a hydraulic support. To improve the interactive experience of learning this process, an interactive control system for a virtual three-dimensional human-hydraulic support environment is needed, so that people can better understand how a hydraulic support works.
Summary of the invention
To address the problems of the prior art described above, the present invention provides a Kinect-based gesture-controlled virtual demonstration system and method for a hydraulic support. An operator only needs to make specific gestures to observe, through the display device module, the virtual hydraulic support performing the corresponding actions, which helps people better understand the working process of a hydraulic support.
To achieve this goal, the technical solution adopted by the present invention is as follows. The Kinect-based gesture-controlled virtual demonstration system for a hydraulic support comprises a Kinect module, a depth-of-field data acquisition module, a joint-point information processing module, a gesture feature matching module, a control module and a display device module; the Kinect module is connected in sequence with the depth-of-field data acquisition module, the joint-point information processing module, the gesture feature matching module, the control module and the display device module.
The Kinect-based gesture-controlled virtual demonstration method for a hydraulic support comprises the following concrete steps:
(1) Preset eight different standard gestures and store them in the control module;
(2) In the control module, map the eight standard gestures to the control signals for eight hydraulic support actions: lowering the column, advancing the support, raising the column, pushing the conveyor, lowering the face guard, raising the face guard, retracting the front beam and raising the front beam;
(3) Initialize the Kinect module and set the camera elevation angle; the operator stands 1.2 m to 3.5 m from the camera and makes one of the eight preset standard gestures;
(4) The Kinect module detects the operator's motion and obtains human skeleton information data frames from the color, depth and skeleton streams (a skeleton-acquisition sketch follows these steps);
(5) The depth-of-field data acquisition module analyzes and processes the skeleton information data frames transmitted by the Kinect module, obtains image depth data, extracts the human skeleton information and, by establishing a 3D coordinate system of the hand and arm joints, obtains the coordinates of the hand and arm joint nodes so as to identify the different parts of the human body;
(6) From the obtained 3D coordinates of the hand and arm joints, the joint-point information processing module calculates the rotation angles of the shoulder joint about its three degrees of freedom and of the elbow joint about its two degrees of freedom; from this shoulder and elbow rotation information it identifies the rotation of the arm skeleton nodes and processes the data by capturing the changes of the various skeletal joint angles;
(7) The gesture feature matching module matches each of the above joint-angle values against the joint-angle values of the eight standard gestures stored in the control module; if the match fails, return to step (3); if the match succeeds, the gesture feature matching module passes the data to the control module;
(8) The control module receives the successfully matched data from the gesture feature matching module, analyzes and processes it, and issues the action control signal of the hydraulic support corresponding to the gesture, thereby driving the three-dimensional hydraulic support model, which is shown by the display device module.
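As an illustration of steps (3)-(5), the following is a minimal sketch of how the right-arm joint coordinates could be read from the skeleton stream. The patent names no SDK or programming language; the sketch assumes a Kinect v2 sensor with the third-party pykinect2 Python wrapper, and the function name right_arm_joints is illustrative.

```python
# Hedged sketch: assumes a Kinect v2 and the third-party pykinect2
# wrapper; the patent itself does not specify an SDK or language.
from pykinect2 import PyKinectV2, PyKinectRuntime

# Open only the body (skeleton) stream, as in step (4).
kinect = PyKinectRuntime.PyKinectRuntime(PyKinectV2.FrameSourceTypes_Body)

def right_arm_joints():
    """Return (shoulder, elbow, wrist) camera-space coordinates in
    metres for the first tracked body, or None if no frame is ready."""
    if not kinect.has_new_body_frame():
        return None
    bodies = kinect.get_last_body_frame()
    if bodies is None:
        return None
    for i in range(kinect.max_body_count):
        body = bodies.bodies[i]
        if not body.is_tracked:
            continue
        joints = body.joints

        def pos(joint_type):
            p = joints[joint_type].Position
            return (p.x, p.y, p.z)

        return (pos(PyKinectV2.JointType_ShoulderRight),
                pos(PyKinectV2.JointType_ElbowRight),
                pos(PyKinectV2.JointType_WristRight))
    return None
```

Note that Kinect camera space has its origin at the sensor, so the shoulder-midpoint reference frame of step (5) is obtained by subtracting the midpoint of the left and right shoulder joints from every coordinate before the joint angles are computed.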
Compared with the prior art, the present invention collects gesture information through the Kinect module, tracks the position of each arm joint using the three-dimensional information of the extracted skeleton points, and, upon recognizing a specific gesture, drives the virtual hydraulic support to perform the corresponding action, helping people better understand the working process of a hydraulic support.
Brief description of the drawings
Fig. 1 is the electrical schematic block diagram of the present invention;
Fig. 2 is a schematic diagram of the human arm motion when a gesture is made in the present invention;
Fig. 3 is the D-H coordinate system of the arm in the present invention;
Fig. 4 is the flow chart of the present invention.
Detailed description of the invention
The invention will be further described below in conjunction with the accompanying drawings.
As shown in Fig. 1 and Fig. 2, the Kinect-based gesture-controlled virtual demonstration system for a hydraulic support comprises a Kinect module, a depth-of-field data acquisition module, a joint-point information processing module, a gesture feature matching module, a control module and a display device module; the Kinect module is connected in sequence with the depth-of-field data acquisition module, the joint-point information processing module, the gesture feature matching module, the control module and the display device module.
The Kinect-based gesture-controlled virtual demonstration method for a hydraulic support comprises the following concrete steps:
(1) Preset eight different standard gestures and store them in the control module;
(2) In the control module, map the eight standard gestures to the control signals for eight hydraulic support actions: lowering the column, advancing the support, raising the column, pushing the conveyor, lowering the face guard, raising the face guard, retracting the front beam and raising the front beam;
(3) Initialize the Kinect module and set the camera elevation angle; the operator stands 1.2 m to 3.5 m from the camera and makes one of the eight preset standard gestures;
(4) The Kinect module detects the operator's motion and obtains human skeleton information data frames from the color, depth and skeleton streams;
(5) The depth-of-field data acquisition module analyzes and processes the skeleton information data frames transmitted by the Kinect module, obtains image depth data, extracts the human skeleton information and, by establishing a 3D coordinate system of the hand and arm joints, obtains the coordinates of the hand and arm joint nodes so as to identify the different parts of the human body;
(6) From the obtained 3D coordinates of the hand and arm joints, the joint-point information processing module calculates the rotation angles of the shoulder joint about its three degrees of freedom and of the elbow joint about its two degrees of freedom; from this shoulder and elbow rotation information it identifies the rotation of the arm skeleton nodes and processes the data by capturing the changes of the various skeletal joint angles. The concrete process is as follows:
As shown in Fig. 3, the motion of the right shoulder joint is decomposed into flexion/extension, abduction/adduction and rotation, corresponding to the angles α1, β1 and γ1 respectively. The elbow motion is flexion/extension, with corresponding angle δ1. A reference coordinate system o0-x0y0z0 is established at the midpoint between the two shoulder joints. Let the 3D coordinate of the right shoulder skeleton node be (x1, y1, z1), that of the right elbow skeleton node (x2, y2, z2), and that of the right wrist skeleton node (x3, y3, z3);
Right-arm shoulder breadth: $L_0 = \sqrt{x_1^2 + y_1^2 + z_1^2}$

Right upper-arm length: $L_1 = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2 + (z_2 - z_1)^2}$

Right forearm length: $L_2 = \sqrt{(x_3 - x_2)^2 + (y_3 - y_2)^2 + (z_3 - z_2)^2}$
For shoulder flexion/extension, $-40° < \alpha_1 < 90°$:

$$\alpha_1 = \arctan\frac{y_2 - y_1}{x_2 - x_1}$$

For shoulder abduction/adduction, $-90° < \beta_1 < 20°$:

$$\beta_1 = \arctan\frac{z_2 - z_1}{x_2 - x_1}$$

For elbow flexion/extension, $0° < \delta_1 < 130°$:

$$\delta_1 = \arccos\frac{(x_1 - x_2)(x_2 - x_3) + (y_1 - y_2)(y_2 - y_3) + (z_1 - z_2)(z_2 - z_3)}{L_1 L_2}$$

(A numerical sketch of these three formulas is given below.)
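The direct-angle computation above can be sketched in a few lines of numpy. This is an illustrative reading of the formulas, not the patent's implementation: arctan2 replaces arctan to preserve the quadrant, the arccos argument is clipped against numerical noise, and the coordinates are assumed to be already expressed in the shoulder-midpoint frame o0-x0y0z0.

```python
import numpy as np

def direct_angles(shoulder, elbow, wrist):
    """Limb lengths and the directly computable joint angles, following
    the formulas above; inputs are (x, y, z) in the o0-x0y0z0 frame."""
    s, e, w = (np.asarray(p, dtype=float) for p in (shoulder, elbow, wrist))
    L0 = np.linalg.norm(s)        # right-arm shoulder breadth
    L1 = np.linalg.norm(e - s)    # upper-arm length
    L2 = np.linalg.norm(w - e)    # forearm length
    # Shoulder flexion/extension and abduction/adduction (arctan2 keeps
    # the quadrant; the patent writes these with plain arctan).
    alpha1 = np.arctan2(e[1] - s[1], e[0] - s[0])
    beta1  = np.arctan2(e[2] - s[2], e[0] - s[0])
    # Elbow flexion: angle between the upper-arm and forearm vectors,
    # clipped to [-1, 1] to guard arccos against rounding noise.
    cos_d = np.dot(s - e, e - w) / (L1 * L2)
    delta1 = np.arccos(np.clip(cos_d, -1.0, 1.0))
    return L0, L1, L2, alpha1, beta1, delta1
```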
The shoulder rotation angle cannot be obtained directly from the known coordinates. Instead, a D-H coordinate system of the human joints is established, the coordinate expression of the wrist node is derived from the principle of coordinate transformation, and γ1 is then solved inversely from the known wrist node coordinate.
From the D-H parameters in Table 1, the affine transformation matrix corresponding to each joint is written out:
$$A_1 = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & L_0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad
A_2 = \begin{bmatrix} \cos\theta_2 & 0 & \sin\theta_2 & 0 \\ \sin\theta_2 & 0 & -\cos\theta_2 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

$$A_3 = \begin{bmatrix} \cos\theta_3 & 0 & \sin\theta_3 & 0 \\ \sin\theta_3 & 0 & -\cos\theta_3 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad
A_4 = \begin{bmatrix} \cos\theta_4 & 0 & \sin\theta_4 & 0 \\ \sin\theta_4 & 0 & -\cos\theta_4 & 0 \\ 0 & 1 & 0 & L_1 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

$$A_5 = \begin{bmatrix} \cos\theta_5 & -\sin\theta_5 & 0 & L_2\cos\theta_5 \\ \sin\theta_5 & \cos\theta_5 & 0 & L_2\sin\theta_5 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

The total transformation matrix is

$$T = A_1 A_2 A_3 A_4 A_5 = \begin{bmatrix} n_x & o_x & a_x & p_x \\ n_y & o_y & a_y & p_y \\ n_z & o_z & a_z & p_z \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
where

$$x_3 = p_x = L_1\cos\theta_2\sin\theta_3 + L_2\cos\theta_5(\sin\theta_2\sin\theta_4 + \cos\theta_2\cos\theta_3\cos\theta_4) + L_2\cos\theta_2\sin\theta_3\sin\theta_5$$

$$y_3 = p_y = L_1\sin\theta_2\sin\theta_3 - L_2\cos\theta_5(\cos\theta_2\sin\theta_4 - \cos\theta_3\cos\theta_4\sin\theta_2) + L_2\sin\theta_2\sin\theta_3\sin\theta_5$$

$$z_3 = p_z = L_0 - L_1\cos\theta_3 - L_2\cos\theta_3\sin\theta_5 + L_2\cos\theta_4\cos\theta_5\sin\theta_3$$

with the joint variables

$$\theta_2 = \alpha_1 - \pi/2, \qquad \theta_3 = \beta_1 - \pi/2, \qquad \theta_4 = \gamma_1 - \pi/2, \qquad \theta_5 = \delta_1$$
Substituting the known angles and lengths into the above formulas, the shoulder rotation angle γ1 can be obtained, with −90° < γ1 < 45°;
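Because γ1 has no closed-form expression in the raw coordinates, one way to realize the inverse step above is to build the five D-H matrices numerically and scan θ4 over the admissible range of γ1 until the p column of T reproduces the measured wrist coordinate. The following is a sketch of that idea under the same conventions as the matrices above; the grid search is an implementation choice, not something the patent specifies.

```python
import numpy as np

def dh_chain(theta2, theta3, theta4, theta5, L0, L1, L2):
    """Total transform T = A1 A2 A3 A4 A5 built from the matrices above."""
    def twist(th, d=0.0):  # common form of A2, A3, A4
        return np.array([[np.cos(th), 0.0,  np.sin(th), 0.0],
                         [np.sin(th), 0.0, -np.cos(th), 0.0],
                         [0.0,        1.0,  0.0,        d  ],
                         [0.0,        0.0,  0.0,        1.0]])
    A1 = np.eye(4)
    A1[2, 3] = L0
    A5 = np.array([[np.cos(theta5), -np.sin(theta5), 0.0, L2 * np.cos(theta5)],
                   [np.sin(theta5),  np.cos(theta5), 0.0, L2 * np.sin(theta5)],
                   [0.0,             0.0,            1.0, 0.0],
                   [0.0,             0.0,            0.0, 1.0]])
    return A1 @ twist(theta2) @ twist(theta3) @ twist(theta4, d=L1) @ A5

def solve_gamma1(wrist, alpha1, beta1, delta1, L0, L1, L2):
    """Grid search over gamma1 in (-90 deg, 45 deg): keep the value whose
    predicted wrist position (px, py, pz) best matches the measured one."""
    th2, th3, th5 = alpha1 - np.pi / 2, beta1 - np.pi / 2, delta1
    best_g, best_err = None, np.inf
    for g in np.deg2rad(np.arange(-90.0, 45.0, 0.25)):
        T = dh_chain(th2, th3, g - np.pi / 2, th5, L0, L1, L2)
        err = np.linalg.norm(T[:3, 3] - np.asarray(wrist, dtype=float))
        if err < best_err:
            best_g, best_err = g, err
    return best_g
```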
(7) The gesture feature matching module matches each of the above joint-angle values against the joint-angle values of the eight standard gestures stored in the control module; if the match fails, return to step (3); if the match succeeds and persists for 5 seconds, the gesture feature matching module passes the data to the control module;
(8) The control module receives the successfully matched data from the gesture feature matching module, analyzes and processes it, and issues the action control signal of the hydraulic support corresponding to the gesture, thereby driving the three-dimensional hydraulic support model, which is shown by the display device module. Table 2 lists the hydraulic support action corresponding to each gesture feature (a matching sketch follows the tables).
Table 1: D-H parameters (α in degrees)

| i | α  | a  | d  | θ  |
|---|----|----|----|----|
| 1 | 0  | 0  | L0 | 0  |
| 2 | 90 | 0  | 0  | θ2 |
| 3 | 90 | 0  | 0  | θ3 |
| 4 | 90 | 0  | L1 | θ4 |
| 5 | 0  | L2 | 0  | θ5 |
Table 2: standard gesture features (in degrees) and the corresponding hydraulic support actions

| No. | α1 | β1  | γ1  | δ1 | Action             |
|-----|----|-----|-----|----|--------------------|
| 1   | 70 | -70 | 0   | 90 | Lower column       |
| 2   | 30 | 0   | -90 | 90 | Advance support    |
| 3   | 0  | -90 | 0   | 90 | Raise column       |
| 4   | 0  | -90 | -90 | 90 | Push conveyor      |
| 5   | 0  | -90 | 45  | 0  | Lower face guard   |
| 6   | 90 | 0   | 45  | 90 | Raise face guard   |
| 7   | 90 | 0   | 45  | 0  | Retract front beam |
| 8   | 90 | -90 | 0   | 0  | Raise front beam   |
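To make steps (7)-(8) concrete, here is a minimal sketch that compares a measured (α1, β1, γ1, δ1) tuple against Table 2 and only reports a gesture once the same match has persisted for 5 seconds. The ±15° tolerance, the polling interval and the English action names are assumptions for illustration; the patent does not state a matching threshold.

```python
import time

# Table 2: standard gesture features (alpha1, beta1, gamma1, delta1),
# in degrees, mapped to hydraulic support actions.
STANDARD_GESTURES = {
    (70, -70,   0, 90): "lower column",
    (30,   0, -90, 90): "advance support",
    ( 0, -90,   0, 90): "raise column",
    ( 0, -90, -90, 90): "push conveyor",
    ( 0, -90,  45,  0): "lower face guard",
    (90,   0,  45, 90): "raise face guard",
    (90,   0,  45,  0): "retract front beam",
    (90, -90,   0,  0): "raise front beam",
}

def match_gesture(angles_deg, tol=15.0):
    """Return the action whose Table 2 feature lies within `tol` degrees
    of the measured (alpha1, beta1, gamma1, delta1) on every axis."""
    for ref, action in STANDARD_GESTURES.items():
        if all(abs(a - r) <= tol for a, r in zip(angles_deg, ref)):
            return action
    return None

def held_gesture(read_angles, hold_s=5.0, poll_s=0.1):
    """Step (7): report a gesture only if the same match persists for
    `hold_s` seconds; `read_angles` returns the current measured tuple."""
    current, since = None, time.monotonic()
    while True:
        action = match_gesture(read_angles())
        now = time.monotonic()
        if action != current:
            current, since = action, now
        elif current is not None and now - since >= hold_s:
            return current
        time.sleep(poll_s)
```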

Claims (1)

1. A Kinect-based gesture-controlled virtual demonstration method for a hydraulic support, characterized in that its concrete steps are:
(1) Preset eight different standard gestures and store them in the control module;
(2) In the control module, map the eight standard gestures to the control signals for eight hydraulic support actions: lowering the column, advancing the support, raising the column, pushing the conveyor, lowering the face guard, raising the face guard, retracting the front beam and raising the front beam;
(3) Initialize the Kinect module and set the camera elevation angle; the operator stands 1.2 m to 3.5 m from the camera and makes one of the eight preset standard gestures;
(4) The Kinect module detects the operator's motion and obtains human skeleton information data frames from the color, depth and skeleton streams;
(5) The depth-of-field data acquisition module analyzes and processes the skeleton information data frames transmitted by the Kinect module, obtains image depth data, extracts the human skeleton information and, by establishing a 3D coordinate system of the hand and arm joints, obtains the coordinates of the hand and arm joint nodes so as to identify the different parts of the human body;
(6) From the obtained 3D coordinates of the hand and arm joints, the joint-point information processing module calculates the rotation angles of the shoulder joint about its three degrees of freedom and of the elbow joint about its two degrees of freedom; from this shoulder and elbow rotation information it identifies the rotation of the arm skeleton nodes and processes the data by capturing the changes of the various skeletal joint angles;
(7) The gesture feature matching module matches each of the above joint-angle values against the joint-angle values of the eight standard gestures stored in the control module; if the match fails, return to step (3); if the match succeeds, the gesture feature matching module passes the data to the control module;
(8) The control module receives the successfully matched data from the gesture feature matching module, analyzes and processes it, and issues the action control signal of the hydraulic support corresponding to the gesture, thereby driving the three-dimensional hydraulic support model, which is shown by the display device module.
CN201510071854.3A 2015-02-11 2015-02-11 Kinect-based gesture-controlled virtual demonstration method for a hydraulic support Active CN104700403B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510071854.3A CN104700403B (en) 2015-02-11 2015-02-11 Kinect-based gesture-controlled virtual demonstration method for a hydraulic support

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510071854.3A CN104700403B (en) 2015-02-11 2015-02-11 Kinect-based gesture-controlled virtual demonstration method for a hydraulic support

Publications (2)

Publication Number Publication Date
CN104700403A CN104700403A (en) 2015-06-10
CN104700403B true CN104700403B (en) 2016-11-09

Family

ID=53347485

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510071854.3A Active CN104700403B (en) 2015-02-11 2015-02-11 Kinect-based gesture-controlled virtual demonstration method for a hydraulic support

Country Status (1)

Country Link
CN (1) CN104700403B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105500370B (en) * 2015-12-21 2018-11-02 华中科技大学 A kind of robot off-line teaching programing system and method based on body-sensing technology
US20180345491A1 (en) * 2016-01-29 2018-12-06 Mitsubishi Electric Corporation Robot teaching device, and method for generating robot control program
CN108602190B (en) * 2016-02-05 2022-02-18 Abb瑞士股份有限公司 Controlling an industrial robot using interactive commands
JP6585665B2 (en) * 2017-06-29 2019-10-02 ファナック株式会社 Virtual object display system
JP6781201B2 (en) * 2018-06-05 2020-11-04 ファナック株式会社 Virtual object display system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103170973A (en) * 2013-03-28 2013-06-26 上海理工大学 Man-machine cooperation device and method based on Kinect video camera

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102727362B (en) * 2012-07-20 2014-09-24 上海海事大学 NUI (Natural User Interface)-based peripheral arm motion tracking rehabilitation training system and training method
CN104108097A (en) * 2014-06-25 2014-10-22 陕西高华知本化工科技有限公司 Feeding and discharging mechanical arm system based on gesture control

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103170973A (en) * 2013-03-28 2013-06-26 上海理工大学 Man-machine cooperation device and method based on Kinect video camera

Also Published As

Publication number Publication date
CN104700403A (en) 2015-06-10

Similar Documents

Publication Publication Date Title
CN104700403B (en) Kinect-based gesture-controlled virtual demonstration method for a hydraulic support
CN103135756B (en) Generate the method and system of control instruction
CN101943946B (en) Two-dimensional image force touch reproducing control method and system based on three-dimensional force sensor
CN103472916A (en) Man-machine interaction method based on human body gesture recognition
CN107688391A (en) A kind of gesture identification method and device based on monocular vision
CN102508578B (en) Projection positioning device and method as well as interaction system and method
CN104570731A (en) Uncalibrated human-computer interaction control system and method based on Kinect
CN104423569A (en) Pointing position detecting device, method and computer readable recording medium
CN109145802B (en) Kinect-based multi-person gesture man-machine interaction method and device
CN102253713A (en) Display system orienting to three-dimensional images
US20130202212A1 (en) Information processing apparatus, information processing method, and computer program
CN105319991A (en) Kinect visual information-based robot environment identification and operation control method
CN104460951A (en) Human-computer interaction method
CN205068294U (en) Human -computer interaction of robot device
CN104656893A (en) Remote interaction control system and method for physical information space
CN100478860C (en) Electronic plane display positioning system and positioning method
CN103092437A (en) Portable touch interactive system based on image processing technology
CN105513128A (en) Kinect-based three-dimensional data fusion processing method
CN102841679A (en) Non-contact man-machine interaction method and device
CN102479386A (en) Three-dimensional motion tracking method of upper half part of human body based on monocular video
Hou et al. Four-point trapezoidal calibration algorithm for human-computer interaction system based on 3D sensor
CN110142769B (en) ROS platform online mechanical arm demonstration system based on human body posture recognition
CN203386146U (en) Infrared video positioning-based man-machine interactive device
CN105373329A (en) Interactive method and system for display and booth
CN203070205U (en) Input equipment based on gesture recognition

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant