CN104700403B - A virtual teaching method of gesture control hydraulic support based on kinect - Google Patents

Info

Publication number
CN104700403B
CN104700403B CN201510071854.3A
Authority
CN
China
Prior art keywords
module
joint
kinect
hydraulic support
gesture
Prior art date
Legal status
Active
Application number
CN201510071854.3A
Other languages
Chinese (zh)
Other versions
CN104700403A (en)
Inventor
刘新华
刘晶晶
王忠宾
谭超
彭俊泉
任衍坤
张秋香
Current Assignee
China University of Mining and Technology CUMT
Original Assignee
China University of Mining and Technology CUMT
Priority date
Filing date
Publication date
Application filed by China University of Mining and Technology CUMT filed Critical China University of Mining and Technology CUMT
Priority to CN201510071854.3A
Publication of CN104700403A
Application granted
Publication of CN104700403B
Legal status: Active

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a kinect-based virtual teaching system and method for gesture control of a hydraulic support, comprising a kinect module, a depth-of-field data acquisition module, a joint-point information processing module, a gesture feature matching module, a control module and a display module; the kinect module is connected in sequence with the depth-of-field data acquisition module, the joint-point information processing module, the gesture feature matching module, the control module and the display module. The operator only needs to make a specific gesture, and the corresponding action of the virtual hydraulic support can be observed through the display module, helping people better understand the working process of a hydraulic support.

Description

A virtual teaching method of gesture control hydraulic support based on kinect

Technical Field

The invention relates to a kinect-based virtual teaching system and method for gesture control of a hydraulic support, and belongs to the technical field of automatic control.

Background Art

Human-computer interaction technology refers to communication between a human and a computer through the computer's input and output devices. A common approach tracks the position and pose of the human hand in real time with devices such as electromagnetic trackers, inertial sensors and data gloves. These are contact sensors: the user must wear extra equipment, so the interaction is not natural enough. Microsoft's somatosensory device kinect, by contrast, is a non-contact sensor: the computer is controlled by the operator's hand movements without any equipment worn on the body, so people can interact with the computer in a natural way.

A hydraulic support is a structure used to control strata pressure at the coal mining face. As essential equipment for fully mechanized coal mining, it provides a safe working space for the shearer, the scraper conveyor and the personnel on the fully mechanized face. It is difficult, however, to learn the working process of a hydraulic support in an intuitive way. To improve the interactive experience of this working process, an interactive control system linking a human operator with a virtual three-dimensional hydraulic support environment is needed, so that people can better understand how a hydraulic support works.

Summary of the Invention

In view of the problems of the prior art described above, the invention provides a kinect-based virtual teaching system and method for gesture control of a hydraulic support: the operator only needs to make a specific gesture, and the corresponding action of the virtual hydraulic support can be observed through the display module, helping people better understand the working process of the hydraulic support.

To achieve the above object, the invention adopts the following technical solution. The kinect-based virtual teaching system for gesture control of a hydraulic support comprises a kinect module, a depth-of-field data acquisition module, a joint-point information processing module, a gesture feature matching module, a control module and a display module; the kinect module is connected in sequence with the depth-of-field data acquisition module, the joint-point information processing module, the gesture feature matching module, the control module and the display module. A minimal sketch of this module chain follows.
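For illustration only, the chain of modules can be pictured as a sequence of processing stages in which each module consumes the previous module's output. The class, method and stage names below are assumptions, not components defined by the patent:

```python
from typing import Callable, List

class TeachingPipeline:
    """Minimal sketch of the module chain; names are illustrative only."""

    def __init__(self) -> None:
        self.stages: List[Callable] = []

    def add(self, stage: Callable) -> "TeachingPipeline":
        self.stages.append(stage)
        return self

    def process(self, frame):
        # Each module consumes the previous module's output, mirroring the
        # kinect -> depth data -> joint points -> matching -> control -> display order.
        for stage in self.stages:
            frame = stage(frame)
        return frame

# Hypothetical wiring (the stage functions are assumed, not defined here):
# pipeline = (TeachingPipeline()
#             .add(acquire_depth_data)       # depth-of-field data acquisition module
#             .add(process_joint_points)     # joint-point information processing module
#             .add(match_gesture_features)   # gesture feature matching module
#             .add(issue_control_signal)     # control module
#             .add(render_virtual_support))  # display module
```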

The specific steps of the kinect-based virtual teaching method for gesture control of a hydraulic support are:

(1) Preset eight different standard gestures and store the eight gestures in the control module;

(2) The control module maps the eight standard gestures to the control signals of eight hydraulic support actions: lowering the column, moving the support, raising the column, pushing the conveyor, lowering the face guard, raising the face guard, retracting the front beam and raising the front beam;

(3) Initialize the kinect module and set the camera elevation angle; the operator stands between 1.2 m and 3.5 m from the camera and makes one of the eight preset standard gestures;

(4) The kinect module detects the operator's motion and obtains human skeleton data frames from the color stream, the depth stream and the skeleton stream;

(5) The depth-of-field data acquisition module analyzes the skeleton data frames sent by the kinect module, obtains the image depth data, extracts the human skeleton information and, by establishing a 3D coordinate system of the arm joints, obtains the coordinates of the arm joint nodes so as to identify different parts of the body;

(6) From the obtained 3D coordinates of the arm joints, the joint-point information processing module computes the rotation angles of the shoulder joint in its three degrees of freedom and of the elbow joint in its two degrees of freedom; from the shoulder and elbow rotation angle information it identifies the rotation of the arm skeleton nodes and processes the data by capturing the changes in the angles of the different skeleton joint points;

(7) The gesture feature matching module matches each of the above joint angle values against the joint angles of the eight standard gestures stored in the control module; if the match fails, return to step (3); if the match succeeds, the gesture feature matching module passes the data to the control module;

(8) The control module receives the successfully matched data from the gesture feature matching module and, after analysis, issues the action control signal of the hydraulic support corresponding to that gesture, thereby driving the three-dimensional virtual hydraulic support, whose motion is shown on the display module. The overall loop is sketched below.
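Putting steps (3) to (8) together, a hypothetical main loop might look like the sketch below. Every argument is an assumed callable or object; the patent defines the steps, not this interface:

```python
def virtual_teaching_loop(kinect, extract_joints, compute_angles, match_gesture, controller):
    """Hypothetical glue code for steps (3)-(8); names are assumptions."""
    while True:
        frame = kinect.read_skeleton_frame()    # step (4): color/depth/skeleton streams
        if frame is None:
            continue                            # no operator tracked yet
        joints = extract_joints(frame)          # step (5): 3D arm joint coordinates
        angles = compute_angles(joints)         # step (6): alpha1, beta1, gamma1, delta1
        gesture = match_gesture(angles)         # step (7): compare with 8 stored gestures
        if gesture is None:
            continue                            # match failed: back to step (3)
        controller.actuate(gesture)             # step (8): drive the virtual support
```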

Compared with the prior art, the invention collects gesture information through the kinect module, tracks the position of each arm joint using the extractable three-dimensional skeleton-point information and, by recognizing specific gestures, drives the virtual hydraulic support to perform the corresponding actions, helping people better understand the working process of a hydraulic support.

Brief Description of the Drawings

Fig. 1 is the electrical schematic block diagram of the invention;

Fig. 2 is a schematic diagram of the arm motion when a gesture is made;

Fig. 3 is the D-H coordinate system of the arm;

Fig. 4 is the flowchart of the invention.

Detailed Description

The invention is further described below with reference to the accompanying drawings.

As shown in Fig. 1 and Fig. 2, the kinect-based virtual teaching system for gesture control of a hydraulic support comprises a kinect module, a depth-of-field data acquisition module, a joint-point information processing module, a gesture feature matching module, a control module and a display module; the kinect module is connected in sequence with the depth-of-field data acquisition module, the joint-point information processing module, the gesture feature matching module, the control module and the display module.

The specific steps of the kinect-based virtual teaching method for gesture control of a hydraulic support are:

(1) Preset eight different standard gestures and store the eight gestures in the control module;

(2) The control module maps the eight standard gestures to the control signals of eight hydraulic support actions: lowering the column, moving the support, raising the column, pushing the conveyor, lowering the face guard, raising the face guard, retracting the front beam and raising the front beam;

(3) Initialize the kinect module and set the camera elevation angle; the operator stands between 1.2 m and 3.5 m from the camera and makes one of the eight preset standard gestures;

(4) The kinect module detects the operator's motion and obtains human skeleton data frames from the color stream, the depth stream and the skeleton stream;

(5) The depth-of-field data acquisition module analyzes the skeleton data frames sent by the kinect module, obtains the image depth data, extracts the human skeleton information and, by establishing a 3D coordinate system of the arm joints, obtains the coordinates of the arm joint nodes so as to identify different parts of the body;

(6) From the obtained 3D coordinates of the arm joints, the joint-point information processing module computes the rotation angles of the shoulder joint in its three degrees of freedom and of the elbow joint in its two degrees of freedom; from the shoulder and elbow rotation angle information it identifies the rotation of the arm skeleton nodes and processes the data by capturing the changes in the angles of the different skeleton joint points. The specific process is as follows:

As shown in Fig. 3, the motion of the right shoulder joint is decomposed into flexion/extension, abduction/adduction and rotation, corresponding to the angles α1, β1 and γ1 respectively; the elbow joint performs flexion/extension, corresponding to the angle δ1. A reference coordinate system o0-x0y0z0 is established at the mid-point between the two shoulder joints. Let the 3D coordinates of the right shoulder joint node be (x1, y1, z1), those of the right elbow joint node (x2, y2, z2), and those of the right wrist joint node (x3, y3, z3);

Right-arm shoulder width: $$L_0 = \sqrt{x_1^2 + y_1^2 + z_1^2}$$

Right upper-arm length: $$L_1 = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2 + (z_2 - z_1)^2}$$

Right forearm length: $$L_2 = \sqrt{(x_3 - x_2)^2 + (y_3 - y_2)^2 + (z_3 - z_2)^2}$$

For the shoulder flexion/extension motion, -40° < α1 < 90°:

$$\alpha_1 = \arctan\left(\frac{y_2 - y_1}{x_2 - x_1}\right)$$

For the shoulder abduction/adduction motion, -90° < β1 < 20°:

$$\beta_1 = \arctan\left(\frac{z_2 - z_1}{x_2 - x_1}\right)$$

For the elbow flexion/extension motion, 0° < δ1 < 130°:

$$\delta_1 = \arccos\frac{(x_1 - x_2)(x_2 - x_3) + (y_1 - y_2)(y_2 - y_3) + (z_1 - z_2)(z_2 - z_3)}{L_1 L_2}$$
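For illustration, the lengths and the three directly computable angles above can be evaluated with numpy as in the sketch below. The function name and interface are assumptions, and np.arctan2 replaces the plain arctan of the formulas to keep the quadrant information:

```python
import numpy as np

def arm_lengths_and_angles(shoulder, elbow, wrist):
    """Sketch of the formulas above; coordinates are assumed to be given in
    the reference frame o0-x0y0z0 at the mid-point between the shoulders."""
    s, e, w = (np.asarray(p, dtype=float) for p in (shoulder, elbow, wrist))
    L0 = np.linalg.norm(s)          # right-arm shoulder width
    L1 = np.linalg.norm(e - s)      # upper-arm length
    L2 = np.linalg.norm(w - e)      # forearm length
    # Shoulder flexion/extension (-40..90 deg) and abduction/adduction (-90..20 deg).
    alpha1 = np.arctan2(e[1] - s[1], e[0] - s[0])
    beta1  = np.arctan2(e[2] - s[2], e[0] - s[0])
    # Elbow flexion/extension (0..130 deg): angle between upper arm and forearm.
    cos_delta = np.dot(s - e, e - w) / (L1 * L2)
    delta1 = np.arccos(np.clip(cos_delta, -1.0, 1.0))
    return L0, L1, L2, alpha1, beta1, delta1
```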

The shoulder rotation angle γ1, however, cannot be obtained directly from the known coordinates. A D-H coordinate system of the arm joints is therefore established; from the coordinate-transformation principle a coordinate expression for the wrist joint node is derived, and γ1 is then deduced from the known wrist joint coordinates.

According to the D-H parameters in Table 1, the affine transformation matrix corresponding to each joint is:

$$A_1 = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & L_0 \\ 0 & 0 & 0 & 1 \end{bmatrix},\qquad A_2 = \begin{bmatrix} \cos\theta_2 & 0 & \sin\theta_2 & 0 \\ \sin\theta_2 & 0 & -\cos\theta_2 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix},$$

$$A_3 = \begin{bmatrix} \cos\theta_3 & 0 & \sin\theta_3 & 0 \\ \sin\theta_3 & 0 & -\cos\theta_3 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix},\qquad A_4 = \begin{bmatrix} \cos\theta_4 & 0 & \sin\theta_4 & 0 \\ \sin\theta_4 & 0 & -\cos\theta_4 & 0 \\ 0 & 1 & 0 & L_1 \\ 0 & 0 & 0 & 1 \end{bmatrix},$$

$$A_5 = \begin{bmatrix} \cos\theta_5 & -\sin\theta_5 & 0 & L_2\cos\theta_5 \\ \sin\theta_5 & \cos\theta_5 & 0 & L_2\sin\theta_5 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

The total transformation matrix is then

$$T = A_1 A_2 A_3 A_4 A_5 = \begin{bmatrix} n_x & o_x & a_x & p_x \\ n_y & o_y & a_y & p_y \\ n_z & o_z & a_z & p_z \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

where

$$x_3 = p_x = L_1\cos\theta_2\sin\theta_3 + L_2\cos\theta_5(\sin\theta_2\sin\theta_4 + \cos\theta_2\cos\theta_3\cos\theta_4) + L_2\cos\theta_2\sin\theta_3\sin\theta_5$$

$$y_3 = p_y = L_1\sin\theta_2\sin\theta_3 - L_2\cos\theta_5(\cos\theta_2\sin\theta_4 - \cos\theta_3\cos\theta_4\sin\theta_2) + L_2\sin\theta_2\sin\theta_3\sin\theta_5$$

$$z_3 = p_z = L_0 - L_1\cos\theta_3 - L_2\cos\theta_3\sin\theta_5 + L_2\cos\theta_4\cos\theta_5\sin\theta_3$$

where the joint variables are related to the motion angles by

$$\theta_2 = \alpha_1 - \pi/2, \qquad \theta_3 = \beta_1 - \pi/2, \qquad \theta_4 = \gamma_1 - \pi/2, \qquad \theta_5 = \delta_1$$

Substituting the known angles and lengths into the above formulas, the shoulder rotation angle γ1 can be solved for (-90° < γ1 < 45°);
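By way of illustration only, the forward D-H chain above and a numerical recovery of γ1 can be sketched as follows. The grid search is an assumed solution technique: the patent only states that γ1 is deduced from the known wrist coordinates, without fixing the method.

```python
import numpy as np

def dh_wrist_position(L0, L1, L2, alpha1, beta1, gamma1, delta1):
    """Wrist position predicted by T = A1 A2 A3 A4 A5 above, with
    theta2 = alpha1 - pi/2, theta3 = beta1 - pi/2, theta4 = gamma1 - pi/2,
    theta5 = delta1. Angles in radians; a verification sketch only."""
    def a_joint(theta, d=0.0):
        # Common form of A2, A3 and A4 (A4 carries the offset d = L1).
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, 0.0,  s, 0.0],
                         [s, 0.0, -c, 0.0],
                         [0.0, 1.0, 0.0, d],
                         [0.0, 0.0, 0.0, 1.0]])
    A1 = np.eye(4); A1[2, 3] = L0
    A2 = a_joint(alpha1 - np.pi / 2)
    A3 = a_joint(beta1 - np.pi / 2)
    A4 = a_joint(gamma1 - np.pi / 2, d=L1)
    c5, s5 = np.cos(delta1), np.sin(delta1)
    A5 = np.array([[c5, -s5, 0.0, L2 * c5],
                   [s5,  c5, 0.0, L2 * s5],
                   [0.0, 0.0, 1.0, 0.0],
                   [0.0, 0.0, 0.0, 1.0]])
    T = A1 @ A2 @ A3 @ A4 @ A5
    return T[:3, 3]                              # (p_x, p_y, p_z)

def solve_gamma1(wrist, L0, L1, L2, alpha1, beta1, delta1):
    """Grid search over the admissible range -90..45 deg for the gamma1
    whose predicted wrist position best matches the measured (x3, y3, z3)."""
    candidates = np.deg2rad(np.linspace(-90.0, 45.0, 271))
    errors = [np.linalg.norm(dh_wrist_position(L0, L1, L2, alpha1, beta1, g, delta1)
                             - np.asarray(wrist, dtype=float))
              for g in candidates]
    return candidates[int(np.argmin(errors))]
```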

(7) The gesture feature matching module matches each of the above joint angle values against the joint angles of the eight standard gestures stored in the control module; if the match fails, return to step (3); if the match succeeds and persists for 5 seconds, the gesture feature matching module passes the data to the control module (a sketch of this persistence check follows);
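A minimal sketch of the 5-second persistence requirement, assuming a per-frame update call; the class and method names are illustrative, not from the patent:

```python
import time

class DwellConfirmer:
    """A matched gesture is forwarded to the control module only after it
    has been held continuously for the full dwell time (5 s in step (7))."""

    def __init__(self, hold_seconds: float = 5.0) -> None:
        self.hold = hold_seconds
        self.current = None     # gesture currently being held
        self.since = 0.0        # time the current gesture was first seen

    def update(self, gesture_id):
        now = time.monotonic()
        if gesture_id is None or gesture_id != self.current:
            self.current, self.since = gesture_id, now   # (re)start the timer
            return None                                  # back to step (3)
        if now - self.since >= self.hold:
            self.since = now                             # avoid immediate re-firing
            return gesture_id                            # confirmed: pass to control module
        return None
```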

(8) The control module receives the successfully matched data from the gesture feature matching module and, after analysis, issues the action control signal of the hydraulic support corresponding to that gesture, thereby driving the three-dimensional virtual hydraulic support, whose motion is shown on the display module. Table 2 lists the hydraulic support action corresponding to each gesture feature; a template-matching sketch follows Table 2.

Table 1 (D-H parameters of the right arm):

i    α (deg)    a     d     θ
1    0          0     L0    0
2    90         0     0     θ2
3    90         0     0     θ3
4    90         0     L1    θ4
5    0          L2    0     θ5

Table 2 (standard gesture templates, angles in degrees, and the corresponding hydraulic support actions):

No.   α1    β1    γ1    δ1    Action
1     70    -70   0     90    Lower column
2     30    0     -90   90    Move support
3     0     -90   0     90    Raise column
4     0     -90   -90   90    Push conveyor
5     0     -90   45    0     Lower face guard
6     90    0     45    90    Raise face guard
7     90    0     45    0     Retract front beam
8     90    -90   0     0     Raise front beam
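For illustration, Table 2 can be rendered as data and matched by nearest template. The patent does not specify a matching threshold, so the tolerance below is an assumed value:

```python
import numpy as np

# Table 2 as data: (alpha1, beta1, gamma1, delta1) in degrees -> support action.
GESTURE_TEMPLATES = {
    (70, -70,   0, 90): "lower column",
    (30,   0, -90, 90): "move support",
    ( 0, -90,   0, 90): "raise column",
    ( 0, -90, -90, 90): "push conveyor",
    ( 0, -90,  45,  0): "lower face guard",
    (90,   0,  45, 90): "raise face guard",
    (90,   0,  45,  0): "retract front beam",
    (90, -90,   0,  0): "raise front beam",
}

def match_gesture(angles_deg, tol_deg=15.0):
    """Nearest-template matching sketch for step (7); tol_deg is assumed."""
    a = np.asarray(angles_deg, dtype=float)
    best_action, best_err = None, np.inf
    for template, action in GESTURE_TEMPLATES.items():
        err = np.max(np.abs(a - np.asarray(template, dtype=float)))  # worst joint error
        if err < best_err:
            best_action, best_err = action, err
    return best_action if best_err <= tol_deg else None  # None -> return to step (3)
```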

Claims (1)

1. A kinect-based virtual teaching method for gesture control of a hydraulic support, characterized in that its specific steps are:
(1) preset eight different standard gestures and store the eight gestures in the control module;
(2) the control module maps the eight standard gestures to the control signals of eight hydraulic support actions: lowering the column, moving the support, raising the column, pushing the conveyor, lowering the face guard, raising the face guard, retracting the front beam and raising the front beam;
(3) initialize the kinect module and set the camera elevation angle; the operator stands between 1.2 m and 3.5 m from the camera and makes one of the eight preset standard gestures;
(4) the kinect module detects the operator's motion and obtains human skeleton data frames from the color stream, the depth stream and the skeleton stream;
(5) the depth-of-field data acquisition module analyzes the skeleton data frames sent by the kinect module, obtains the image depth data, extracts the human skeleton information and, by establishing a 3D coordinate system of the arm joints, obtains the coordinates of the arm joint nodes so as to identify different parts of the body;
(6) from the obtained 3D coordinates of the arm joints, the joint-point information processing module computes the rotation angles of the shoulder joint in its three degrees of freedom and of the elbow joint in its two degrees of freedom; from the shoulder and elbow rotation angle information it identifies the rotation of the arm skeleton nodes and processes the data by capturing the changes in the angles of the different skeleton joint points;
(7) the gesture feature matching module matches each of the above joint angle values against the joint angles of the eight standard gestures stored in the control module; if the match fails, return to step (3); if the match succeeds, the gesture feature matching module passes the data to the control module;
(8) the control module receives the successfully matched data from the gesture feature matching module and, after analysis, issues the action control signal of the hydraulic support corresponding to the gesture, thereby controlling the three-dimensional virtual hydraulic support, whose action is shown on the display module.
CN201510071854.3A 2015-02-11 2015-02-11 A virtual teaching method of gesture control hydraulic support based on kinect Active CN104700403B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510071854.3A CN104700403B (en) 2015-02-11 2015-02-11 A virtual teaching method of gesture control hydraulic support based on kinect

Publications (2)

Publication Number Publication Date
CN104700403A CN104700403A (en) 2015-06-10
CN104700403B true CN104700403B (en) 2016-11-09

Family

ID=53347485

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510071854.3A Active CN104700403B (en) 2015-02-11 2015-02-11 A virtual teaching method of gesture control hydraulic support based on kinect

Country Status (1)

Country Link
CN (1) CN104700403B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105500370B (en) * 2015-12-21 2018-11-02 华中科技大学 A kind of robot off-line teaching programing system and method based on body-sensing technology
DE112016006116T5 (en) * 2016-01-29 2018-09-13 Mitsubishi Electric Corporation A robotic teaching apparatus and method for generating a robotic control program
EP3411195B1 (en) * 2016-02-05 2022-08-31 ABB Schweiz AG Controlling an industrial robot using interactive commands
JP6585665B2 (en) * 2017-06-29 2019-10-02 ファナック株式会社 Virtual object display system
JP6781201B2 (en) * 2018-06-05 2020-11-04 ファナック株式会社 Virtual object display system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103170973A (en) * 2013-03-28 2013-06-26 上海理工大学 Man-machine cooperation device and method based on Kinect video camera

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102727362B (en) * 2012-07-20 2014-09-24 上海海事大学 Rehabilitation training system and training method based on somatosensory peripheral arm movement tracking
CN104108097A (en) * 2014-06-25 2014-10-22 陕西高华知本化工科技有限公司 Feeding and discharging mechanical arm system based on gesture control

Also Published As

Publication number Publication date
CN104700403A (en) 2015-06-10

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant