CN104570731A - Uncalibrated human-computer interaction control system and method based on Kinect - Google Patents
Uncalibrated human-computer interaction control system and method based on Kinect

- Publication number: CN104570731A
- Application number: CN201410733412.6A
- Authority: CN (China)
- Prior art keywords: mechanical arm, kinect, human, hand, computer
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
- Landscapes: Manipulator (AREA)
Abstract
The invention discloses an uncalibrated human-computer interaction control system and method based on Kinect. The system comprises human skeleton information acquisition, a computer software system and a mechanical arm. Human depth-image acquisition is performed by a Kinect sensor; the computer software system processes the acquired depth data with a skeleton-tracking technique and establishes the 3D (three-dimensional) coordinates of 20 human skeleton points; and the control system of the mechanical arm receives the control commands converted and sent by the computer system, and drives the end effector to follow the hand motion, achieving real-time guidance of the end effector of the mechanical arm by the hand. Master-slave control is performed by constructing a motion mapping between the hand relative to the hip center and the end effector relative to the base of the mechanical arm, so that real-time interaction between the hand and the mechanical arm is achieved. Experimental results indicate that the system accomplishes hand tracking and master-slave control tasks well, with good real-time performance and interactivity.
Description
Technical field
The invention belongs to the field of barrier-free information engineering technology, and specifically relates to an uncalibrated human-computer interaction control system and method based on Kinect.
Background technology
Master-slave teleoperated manipulator control systems are widely used in high-risk operations such as nuclear reactor maintenance, manned spaceflight and the simulated training of medical surgery, and the hand is the most frequently used part of the human body. Enabling a mechanical arm to follow hand motion can therefore greatly extend human perception and capability, and studying the various motions of the hand and the interaction between the hand and the mechanical arm has important theoretical and practical significance.
At present there are two main hand tracking and localization technologies: data-glove techniques and gesture recognition based on image processing. Data gloves and their cabling are cumbersome and inconvenient to use, and factors such as high cost limit their application. Weinland et al. proposed recognizing hand motion by 3D modeling and HMM methods. Liu et al. attached RFID tags at the subject's wrist to track and identify the hand state; this method can determine the subject's wrist position. However, color images are easily affected by lighting conditions, and traditional vision-based moving-target tracking requires calibration of the camera's intrinsic and extrinsic parameters, which cannot be performed accurately in practical situations.
Summary of the invention
To overcome the defects of the prior art, the invention provides an uncalibrated human-computer interaction control system and method based on Kinect. Kinect is a motion-sensing camera that can perform background subtraction on each acquired depth frame to obtain the 3D coordinates of the human joints, i.e., the skeleton information, from which the position of the hand is obtained without calibration. In view of this, the invention uses the Kinect skeleton-tracking technique to obtain hand motion information, builds the motion mapping between the hand and the mechanical arm, and studies the master-slave control method between them. The technical scheme is as follows:
An uncalibrated human-computer interaction control system based on Kinect consists of four parts: the operator, the Kinect sensor, the computer software and the mechanical arm control system. The operator stands in the workspace of the Kinect; the Kinect is connected to the computer system by a USB data cable; and the mechanical arm control system is connected to the computer system by RS232. The whole system comprises four functional modules: a human depth-image acquisition module, a skeleton-point tracking module, an inverse kinematics analysis module and a lower-computer communication module. The depth-image acquisition module is implemented by the Kinect sensor, which sends depth images to the computer system at 30 frames per second. The skeleton-point tracking module is implemented by the computer software; its main work comprises image preprocessing, human joint-point recognition and smoothing of the skeleton data. The main work of the inverse kinematics module comprises the mapping between the motion of the hand relative to the hip center and the motion of the mechanical arm relative to its base, together with the inverse kinematics analysis and solution of the mechanical arm. The lower-computer communication module generates the control commands, standardizes the command format, and sends and receives the commands.
Preferably, to guarantee real-time performance, the depth image uses a resolution of 640*480 at a frame rate of 30 f/s. To guarantee the accuracy of the depth distance, the operator should be within the sensor's 57° horizontal and 43° vertical field of view about the optical axis, and within this range a threshold method filters the depth data to 1220 mm-3810 mm.
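The threshold filtering of the depth data to the 1220-3810 mm working range can be sketched as follows. This is a minimal illustration, assuming the depth frame arrives as a millimetre-valued array; the function name and the sample values are not from the patent:

```python
import numpy as np

# Working range stated in the text: 1220 mm to 3810 mm.
DEPTH_MIN_MM = 1220
DEPTH_MAX_MM = 3810

def filter_depth(frame_mm):
    """Zero out depth samples outside the reliable working range
    (a simple threshold filter; Kinect uses 0 for 'no reading')."""
    frame_mm = np.asarray(frame_mm)
    mask = (frame_mm >= DEPTH_MIN_MM) & (frame_mm <= DEPTH_MAX_MM)
    return np.where(mask, frame_mm, 0)

# Example: samples at 500 mm and 4000 mm fall outside the band and are zeroed.
print(filter_depth([[500, 1500], [2500, 4000]]))
```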
An uncalibrated human-computer interaction control method based on Kinect comprises the following steps:
Step 1: human depth-image acquisition;
Step 2: image preprocessing, human joint-point recognition and smoothing of the skeleton data;
Step 3: mapping between the motion of the hand relative to the hip center and the motion of the mechanical arm relative to its base, and inverse kinematics analysis and solution of the mechanical arm;
Step 4: generation of the control commands, standardization of the command format, and sending and receiving of the commands.
Beneficial effects of the invention:
The invention provides a novel human-computer interaction mode. The Kinect skeleton-tracking technique is used to process the depth data and obtain the hand position; the motion mapping between the hand relative to the hip center and the end effector relative to the mechanical arm base is built for master-slave control, realizing real-time interaction between the hand and the robotic arm. For hand-jitter elimination and outlier processing, a moving-average trajectory-smoothing algorithm based on position increments is proposed. Experimental results show that the system accomplishes hand tracking and master-slave control tasks well, with good real-time performance and interactivity.
Description of the drawings
Fig. 1 is the framework diagram of the uncalibrated human-computer interaction control system based on Kinect;
Fig. 2 is the flow chart of the uncalibrated human-computer interaction control method based on Kinect;
Fig. 3 shows the definition of the 20 human skeletal joint points;
Fig. 4 shows the implementation of the trajectory-smoothing process;
Fig. 5 shows the link coordinate systems of the five-axis mechanical arm.
Embodiment
The technical scheme of the invention is described in more detail below with reference to the drawings and specific embodiments.
1 system framework
The system block diagram is shown in Fig. 1:
Human depth-image acquisition is performed by the Kinect sensor 2. The computer software 3 processes the acquired depth data with the skeleton-tracking technique and establishes the 3D coordinates of the 20 human skeleton points. The mechanical arm control system 4 receives the control commands converted and sent by the computer system, and drives the end effector to follow the hand motion, realizing real-time guidance of the robot arm's end effector by the hand. The operator 1 stands in the workspace of the Kinect.
2 Hand-information acquisition and the master-slave control method
2.1 Brief introduction to Kinect
Kinect, developed by Microsoft, is a device that acquires color images and depth-image data in real time, supports whole-body and half-body skeleton tracking, and can recognize a series of actions. It consists of three parts: an RGB color camera, an infrared emitter and an infrared CMOS camera (IR).
2.2 Human detection and hand-information acquisition
A key technique in the master-slave control system of the invention is the accurate acquisition of the hand pose. Common methods for moving-object detection and tracking are the frame-difference method, optical flow and background subtraction. The shortcoming of optical flow is that it needs many iterations to converge; its computation is heavy and real-time requirements are hard to meet. The problem of the frame-difference method is that the overlapping motion region between consecutive frames produces holes and imprecise detection, and the result is sensitive to the threshold: too large a threshold produces holes, too small a threshold produces noise. Traditional background subtraction builds a background model and subtracts it from the current frame to determine the position and shape of the target, but color images are affected by illumination changes and complex backgrounds, making tracking and segmentation imprecise. Some scholars have used the depth image generated by Kinect for background subtraction to obtain a segmented human depth image, with good results. The authors therefore propose a contactless method of obtaining hand motion information using the Kinect skeleton-tracking technique. The process is as follows: the segmented human depth image is fed into a motion-capture machine-learning system that distinguishes the human body; a random-forest classification technique provides an intermediate representation of body parts, mapping the pose-estimation problem to a per-pixel classification problem; re-projecting the classifier's body-part estimates generates confident 3D proposals for the human joints; finally, the human skeleton is built from the 20 tracked joint points. Fig. 3 shows the definition of the skeleton points. Once the skeleton data are obtained, the position of the hand relative to the hip center is easy to compute.
This method avoids the complicated camera-calibration process, and obtains hand-position information of higher precision than camera-calibration-based methods.
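The calibration-free relative-position computation described above (hand minus hip center, both in the Kinect frame) can be sketched as below; the joint coordinates are illustrative values, not measurements from the patent:

```python
import numpy as np

def hand_relative_to_hip(hand_right, hip_center):
    """KP = KP1 - KP2: right-hand position relative to the hip center,
    both expressed in the Kinect camera frame {K}, so no camera
    calibration is required."""
    return np.asarray(hand_right, dtype=float) - np.asarray(hip_center, dtype=float)

# Illustrative skeleton readings in metres.
rel = hand_relative_to_hip([0.4, 0.1, 2.0], [0.1, -0.2, 2.1])
print(rel)
```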
2.3 Coordinate-data smoothing
Owing to the unstable performance of the Kinect hardware and incoherent operator actions, the relative positions of the skeletal joint points estimated by the above method may change greatly between frames, and the skeleton data sequence may contain outliers. Therefore, before inverse kinematics is performed, outliers must be identified and rejected, and the skeleton data must be denoised and smoothed.
The principle of the moving-average trajectory smoothing is to take the mean of the hand position at the current time and the hand positions of the preceding N-1 sampling periods as the master-slave control position at the current time. The position information collected by the Kinect is processed item by item along the time series with step N, finally giving the desired position throughout the master-slave control process and realizing the control of the slave arm's end position. This algorithm filters out periodic variation and smooths the position curve; the implementation of the smoothing process is shown in Fig. 4.
While eliminating periodic hand jitter and other random disturbances, the moving-average trajectory smoothing may also filter out position information the operator intends. To retain the operator's intended positions as far as possible while eliminating hand tremor, the invention increases, on the basis of the sliding-average algorithm, the weight of the current position in the smoothed position. Let N be the time step of the trajectory-smoothing algorithm and i the weight of the current position in the planned position; then for k >= N the planned position at time k is
P_planned(k) = i*P(k) + ((1 - i)/(N - 1)) * sum_{j=k-N+1}^{k-1} P(j).
For k < N, the planned position at time k is
P_planned(k) = i*P(k) + ((1 - i)/(k - 1)) * sum_{j=1}^{k-1} P(j).
Besides the interference produced by incoherent hand motion and jitter, the collected hand positions also contain random outliers. Since the invention uses a master-slave control strategy based on position increments, when solving the position increment the authors do not apply the trajectory smoothing directly to the collected hand positions, but smooth the increments of the hand position between adjacent moments. In this way the moving-average filter better removes the erratic variation in the collected hand positions. Tests verify that the moving-average trajectory-smoothing algorithm based on position increments eliminates most of the jitter and outliers in the hand motion.
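A minimal sketch of the increment-based moving-average smoothing follows: raw increments between adjacent samples are averaged over a window of size N, the current increment weighted by i, and the smoothed increments are re-accumulated into a trajectory. The warm-up behaviour before N samples and the default parameter values are assumptions, not the patent's exact procedure:

```python
from collections import deque
import numpy as np

def smooth_trajectory(positions, N=5, i=0.5):
    """Moving-average trajectory smoothing based on position increments:
    the current increment gets weight i, the stored previous increments
    share the remaining 1 - i."""
    positions = [np.asarray(p, dtype=float) for p in positions]
    window = deque(maxlen=N - 1)      # raw increments of the last N-1 steps
    out = [positions[0].copy()]
    for prev, cur in zip(positions, positions[1:]):
        inc = cur - prev              # raw increment between adjacent frames
        smoothed = i * inc + (1 - i) * np.mean(window, axis=0) if window else inc
        window.append(inc)
        out.append(out[-1] + smoothed)
    return out

# A constant-velocity trajectory passes through unchanged; a one-frame
# outlier would instead be damped by the averaged increment history.
track = smooth_trajectory([[0, 0, 0], [1, 0, 0], [2, 0, 0], [3, 0, 0]])
print(track[-1])
```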
2.4 Master-slave control of the manipulator by the hand
When the hand controls the mechanical arm, the motion of the hand must be mapped to the motion of the arm. In master-slave control systems there are usually two motion-mapping modes: joint-space mapping and Cartesian-space mapping. Since the operation task of the invention mainly controls the end pose of the arm, the system adopts Cartesian-space mapping, i.e., the pose of the hand is the input for the pose of the robot arm's end effector. For the control mode, incremental control is adopted: the pose increment of the hand motion is added to the current pose of the end effector, realizing real-time guidance of the mechanical arm by the hand.
The positions of the hand and the hip center are expressed in the Kinect coordinate system. To establish the motion mapping above, they must be transformed into the robot coordinate system, as follows:
Step 1: extract from the skeleton information stream SkeletonFrame the position vector of the right hand in the Kinect coordinate system {K}, KP1 = [x1, y1, z1]^T, and the position vector of the hip center in {K}, KP2 = [x2, y2, z2]^T. The position of the operator's right hand relative to the hip center in {K} is then obtained without any camera calibration:
KP = KP1 - KP2 (3)
Step 2: the skeleton-tracking coordinate system is the Kinect camera coordinate system, with origin at the infrared camera center, Z axis along the infrared camera's optical axis, X axis horizontal and Y axis vertical. To obtain the coordinate BP of the arm's end relative to the base, the rotation matrix transforming the Kinect camera coordinate system {K} into the robot coordinate system {B} is needed. Rotating {K} by -90° about the Z axis (right-hand rule) and then by -90° about the X axis gives this rotation matrix.
Step 3: the coordinate BP of the arm's end relative to the base is finally obtained.
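The two elementary rotations named in Step 2 can be composed as in the sketch below. The composition order (Z rotation first, then X) is an assumption, since the patent only names the two rotations; the helper names are illustrative:

```python
import numpy as np

def rot_z(deg):
    """Elementary rotation about the Z axis by `deg` degrees."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_x(deg):
    """Elementary rotation about the X axis by `deg` degrees."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

# {K} rotated -90 deg about Z, then -90 deg about X (right-hand rule).
R_KB = rot_z(-90) @ rot_x(-90)

def kinect_to_base(p_k):
    """Express a point given in the Kinect frame {K} in the robot base frame {B}."""
    return R_KB @ np.asarray(p_k, dtype=float)

# The Kinect depth axis (Z in {K}) maps onto an axis of the base frame.
print(kinect_to_base([0.0, 0.0, 1.0]))
```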
3 mechanical arm inverse kinematics
The link coordinate systems of the five-degree-of-freedom manipulator used in the invention are shown in Fig. 5, and its D-H parameters are listed in Table 1 below:
The D-H parameter list of table 1 mechanical arm
In Cartesian space, the inverse kinematics problem of the robotic arm is: given the attitude and position of the end relative to the base coordinate system and the geometric parameters of the arm, determine each joint variable. The invention adopts the inverse-transform method: the transformation matrix is pre-multiplied by one or several inverse-transform matrices, and corresponding elements on both sides of the equation are compared to solve the inverse kinematics. As above, the position vector of the arm's end in the base frame, BP = [x, y, z]^T, is obtained from formula (5); from Table 1, the homogeneous transformation matrix of the end coordinate system relative to the base coordinate system, based on the D-H parameters, is:
where
r11 = c1c234c5 + s1s5;  r12 = -c1c234s5 + s1c5;  r13 = -c1s234;
r21 = s1c234c5 - c1s5;  r22 = -s1c234s5 - c1c5;  r23 = -s1s234;
r31 = -s234c5;  r32 = s234s5;  r33 = -c234;
px = c1(a2c2 + a3c23 - d5s234);
py = s1(a2c2 + a3c23 - d5s234);
pz = -a2s2 - a3s23 + d1 - d5c234,
in which s234 = sin(θ2 + θ3 + θ4), c234 = cos(θ2 + θ3 + θ4), and the other abbreviations are analogous.
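The closed-form position components above can be checked numerically. A hedged sketch follows, with placeholder link values (the MRT-I's actual D-H values are those of Table 1, not reproduced here):

```python
import numpy as np

def end_position(theta, a2, a3, d1, d5):
    """px, py, pz of the end effector from the closed-form elements above.
    theta = (t1, ..., t5) in radians; link parameters are placeholders."""
    t1, t2, t3, t4, t5 = theta
    s234, c234 = np.sin(t2 + t3 + t4), np.cos(t2 + t3 + t4)
    c23, s23 = np.cos(t2 + t3), np.sin(t2 + t3)
    reach = a2 * np.cos(t2) + a3 * c23 - d5 * s234
    px = np.cos(t1) * reach
    py = np.sin(t1) * reach
    pz = -a2 * np.sin(t2) - a3 * s23 + d1 - d5 * c234
    return px, py, pz

# At the zero configuration the formulas reduce to
# px = a2 + a3, py = 0, pz = d1 - d5.
print(end_position((0, 0, 0, 0, 0), a2=1.0, a3=1.0, d1=1.0, d5=0.5))
```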
The kinematic equation is:
From (6) and (7), the algebraic method gives the expressions for the joint angles of the mechanical arm as follows:
θ1 = Atan2(px, py) (8)
θ2 = Atan2(s2, c2) (9)
θ5 = Atan2(r11py - r21px, r12py - r22px) (12)
Wherein
In this way all joint angles of the five-degree-of-freedom manipulator are obtained algebraically. Because some joints of the arm admit symmetric angles, a given end-effector pose may have multiple inverse solutions. Joint motions are limited in range, however, so the poses corresponding to some inverse solutions cannot be realized. In actual control of the arm, the invention selects among the multiple inverse solutions using the joint limits and the shortest-path criterion.
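The angle extraction of equations (8) and (12), and the joint-limit screening of multiple inverse solutions, can be sketched as follows. The Atan2 argument order follows the formulas as printed, and the limit values and function names are placeholders, not the patent's:

```python
import numpy as np

def theta1(px, py):
    """Equation (8): theta1 = Atan2(px, py), argument order as printed."""
    return np.arctan2(px, py)

def theta5(px, py, r11, r21, r12, r22):
    """Equation (12): theta5 from the rotation-matrix elements."""
    return np.arctan2(r11 * py - r21 * px, r12 * py - r22 * px)

def feasible_solutions(solutions, limits):
    """Keep only inverse solutions whose joint angles all lie inside the
    joint limits (a simple stand-in for the patent's limit-plus-shortest-path
    selection among multiple inverse solutions)."""
    return [s for s in solutions
            if all(lo <= q <= hi for q, (lo, hi) in zip(s, limits))]

# Two candidate solutions for one joint; only the in-range one survives.
print(feasible_solutions([(0.1,), (3.0,)], [(-1.57, 1.57)]))
```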
4 experimental results and analysis
The invention uses an MRT-I five-degree-of-freedom manipulator as the controlled object; the Kinect color image resolution is 640 × 480 and the depth image resolution is 320 × 240. Software environment: Windows 7 + Visual Studio 2010 + Kinect SDK v1.7; development language: C#. The system flow is shown in Fig. 2.
4.1 Experiment description
To verify the feasibility of the system and its recognition rate under different illumination conditions, a pick-and-place experiment with the mechanical arm was designed. The experiment simulates a pick-and-place operation, moving an object from position A to position B. Since the skeleton point is located at the center of the right palm, and considering the influence of hand size on the experiment, the circle size matches an adult palm: guidance is considered successful when the end effector falls within the circle.
4.2 Feasibility verification
The operation results on the system's master interface show that the mechanical arm responds in time to the position of the hand relative to the hip center and the issued control commands.
In the three-dimensional position curves of the right hand and the arm's end during the pick-up operation, expressed in the robot coordinate system, the black curve is the position of the right hand and the grey curve is the position of the robot arm's end effector. The position curves show that the system has a certain response time: on the one hand the mechanical arm system itself has a response time, and on the other hand smoothing the position increments of the hand motion introduces a performance cost; the more smoothing, the longer the response time.
4.3 Recognition-rate verification
To verify the recognition rate of the system, 10 students each performed 50 experiments under different illumination conditions. Table 2 gives the number of correct operations and the accuracy under a dark environment and under fluorescent light. The comparison across illumination conditions proves that the proposed hand-tracking mechanical-arm control system is robust to illumination conditions and complex-background interference.
Experimental comparison under table 2 different illumination conditions
The invention uses a five-degree-of-freedom manipulator as the controlled object, obtains depth images with the Kinect sensor, establishes the joint coordinates of the human body by processing the depth images with the skeleton-tracking technique, builds the motion mapping between hand motion and manipulator motion, computes each joint angle of the arm by inverse kinematics, and finally realizes guidance control of the mechanical arm by the hand. Experiments show that this contactless master-slave control strategy is simple and intuitive to operate, with strong real-time performance and interactivity. To refine the system, future work will add gesture control and a remote-control module on the existing basis, preparing the Kinect for practical application in robot teleoperation.
The above is only a preferred embodiment of the invention; the scope of protection of the invention is not limited thereto. Any simple change or equivalent replacement of the technical scheme that can be obviously obtained by anyone familiar with the art within the technical scope disclosed by the invention falls within the scope of protection of the invention.
Claims (3)
1. An uncalibrated human-computer interaction control system based on Kinect, characterized in that: it consists of four parts comprising the operator, the Kinect sensor, the computer software and the mechanical arm control system; the operator stands in the workspace of the Kinect, the Kinect is connected to the computer system by a USB data cable, and the mechanical arm control system is connected to the computer system by RS232; the whole system comprises four functional modules, namely a human depth-image acquisition module, a skeleton-point tracking module, an inverse kinematics analysis module and a lower-computer communication module; the depth-image acquisition module is implemented by the Kinect sensor, which sends depth images to the computer system at 30 frames per second; the skeleton-point tracking module is implemented by the computer software, and its main work comprises image preprocessing, human joint-point recognition and smoothing of the skeleton data; the main work of the inverse kinematics module comprises the mapping between the motion of the hand relative to the hip center and the motion of the mechanical arm relative to its base, together with the inverse kinematics analysis and solution of the mechanical arm; the lower-computer communication module generates the control commands, standardizes the command format, and sends and receives the commands.
2. The uncalibrated human-computer interaction control system based on Kinect according to claim 1, characterized in that: the depth image uses a resolution of 640*480 at a frame rate of 30 f/s; to guarantee the accuracy of the depth distance, the operator should be within the sensor's 57° horizontal and 43° vertical field of view about the optical axis, and within this range a threshold method filters the depth data to 1220 mm-3810 mm.
3. An uncalibrated human-computer interaction control method based on Kinect, characterized in that it comprises the following steps:
Step 1: human depth-image acquisition;
Step 2: image preprocessing, human joint-point recognition and smoothing of the skeleton data;
Step 3: mapping between the motion of the hand relative to the hip center and the motion of the mechanical arm relative to its base, and inverse kinematics analysis and solution of the mechanical arm;
Step 4: generation of the control commands, standardization of the command format, and sending and receiving of the commands.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410733412.6A CN104570731A (en) | 2014-12-04 | 2014-12-04 | Uncalibrated human-computer interaction control system and method based on Kinect |
Publications (1)
Publication Number | Publication Date |
---|---|
CN104570731A true CN104570731A (en) | 2015-04-29 |
Family
ID=53087101
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410733412.6A Pending CN104570731A (en) | 2014-12-04 | 2014-12-04 | Uncalibrated human-computer interaction control system and method based on Kinect |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104570731A (en) |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105022483A (en) * | 2015-07-08 | 2015-11-04 | 安徽瑞宏信息科技有限公司 | Kinect based public information terminal |
CN105058396A (en) * | 2015-07-31 | 2015-11-18 | 深圳先进技术研究院 | Robot teaching system and control method thereof |
CN105137973A (en) * | 2015-08-21 | 2015-12-09 | 华南理工大学 | Method for robot to intelligently avoid human under man-machine cooperation scene |
CN105232155A (en) * | 2015-09-08 | 2016-01-13 | 微创(上海)医疗机器人有限公司 | Surgical robot adjustment system |
CN105710856A (en) * | 2015-06-01 | 2016-06-29 | 李锦辉 | Remote motion sensing control robot |
CN106272446A (en) * | 2016-08-01 | 2017-01-04 | 纳恩博(北京)科技有限公司 | The method and apparatus of robot motion simulation |
CN106354161A (en) * | 2016-09-26 | 2017-01-25 | 湖南晖龙股份有限公司 | Robot motion path planning method |
CN106514667A (en) * | 2016-12-05 | 2017-03-22 | 北京理工大学 | Human-computer cooperation system based on Kinect skeletal tracking and uncalibrated visual servo |
CN106647423A (en) * | 2015-10-28 | 2017-05-10 | 中国移动通信集团公司 | Intelligent following shooting method and intelligent following shooting device |
CN106737685A (en) * | 2017-01-16 | 2017-05-31 | 上海大界机器人科技有限公司 | Manipulator motion system based on computer vision with man-machine real-time, interactive |
CN106846403A (en) * | 2017-01-04 | 2017-06-13 | 北京未动科技有限公司 | The method of hand positioning, device and smart machine in a kind of three dimensions |
CN107253192A (en) * | 2017-05-24 | 2017-10-17 | 湖北众与和智能装备科技有限公司 | It is a kind of based on Kinect without demarcation human-computer interactive control system and method |
CN107363831A (en) * | 2017-06-08 | 2017-11-21 | 中国科学院自动化研究所 | The teleoperation robot control system and method for view-based access control model |
CN108453742A (en) * | 2018-04-24 | 2018-08-28 | 南京理工大学 | Robot man-machine interactive system based on Kinect and method |
CN108594657A (en) * | 2018-04-11 | 2018-09-28 | 福建省德腾智能科技有限公司 | A kind of mechanical arm self-adaptation control method based on neural network |
CN109330494A (en) * | 2018-11-01 | 2019-02-15 | 珠海格力电器股份有限公司 | Sweeping robot control method based on action recognition, system, sweeping robot |
CN109407709A (en) * | 2018-09-25 | 2019-03-01 | 国网天津市电力公司 | A kind of meeting camera shooting automatic tracking system based on Kinect bone track algorithm |
CN109872594A (en) * | 2019-03-20 | 2019-06-11 | 西安医学院第二附属医院 | A kind of 3D context of care simulation numeral learning system |
CN109968310A (en) * | 2019-04-12 | 2019-07-05 | 重庆渝博创智能装备研究院有限公司 | A kind of mechanical arm interaction control method and system |
CN111590560A (en) * | 2020-04-24 | 2020-08-28 | 郭子睿 | Method for remotely operating manipulator through camera |
CN112706158A (en) * | 2019-10-25 | 2021-04-27 | 中国科学院沈阳自动化研究所 | Industrial man-machine interaction system and method based on vision and inertial navigation positioning |
CN115129049A (en) * | 2022-06-17 | 2022-09-30 | 广东工业大学 | Mobile service robot path planning system and method with social awareness |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103170973A (en) * | 2013-03-28 | 2013-06-26 | 上海理工大学 | Man-machine cooperation device and method based on Kinect video camera |
CN103302668A (en) * | 2013-05-22 | 2013-09-18 | 东南大学 | Kinect-based space teleoperation robot control system and method thereof |
CN103903011A (en) * | 2014-04-02 | 2014-07-02 | 重庆邮电大学 | Intelligent wheelchair gesture recognition control method based on image depth information |
Non-Patent Citations (2)
Title |
---|
Executive Committee of the First Youth Academic Annual Conference of the China Association for Science and Technology: "Proceedings of the First Youth Academic Annual Conference of the China Association for Science and Technology", 30 April 1992 *
Lin Haibo et al.: "Design and implementation of a somatosensory interaction system for a mechanical arm based on Kinect skeleton information", Computer Applications and Software *
Cited By (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105710856A (en) * | 2015-06-01 | 2016-06-29 | Li Jinhui | Remote motion sensing control robot
CN105022483A (en) * | 2015-07-08 | 2015-11-04 | Anhui Ruihong Information Technology Co., Ltd. | Kinect-based public information terminal
CN105058396A (en) * | 2015-07-31 | 2015-11-18 | Shenzhen Institutes of Advanced Technology | Robot teaching system and control method thereof
CN105137973B (en) * | 2015-08-21 | 2017-12-01 | South China University of Technology | Method for a robot to intelligently avoid humans in a human-robot collaboration scene
CN105137973A (en) * | 2015-08-21 | 2015-12-09 | South China University of Technology | Method for robot to intelligently avoid human under man-machine cooperation scene
CN105232155A (en) * | 2015-09-08 | 2016-01-13 | MicroPort (Shanghai) Medical Robot Co., Ltd. | Surgical robot adjustment system
CN105232155B (en) * | 2015-09-08 | 2018-11-09 | MicroPort (Shanghai) Medical Robot Co., Ltd. | Surgical robot adjustment system
CN106647423A (en) * | 2015-10-28 | 2017-05-10 | China Mobile Communications Corporation | Intelligent following shooting method and intelligent following shooting device
CN106272446A (en) * | 2016-08-01 | 2017-01-04 | Ninebot (Beijing) Technology Co., Ltd. | Method and apparatus for robot motion simulation
CN106272446B (en) * | 2016-08-01 | 2019-02-12 | Ninebot (Beijing) Technology Co., Ltd. | Method and apparatus for robot motion simulation
CN106354161A (en) * | 2016-09-26 | 2017-01-25 | Hunan Huilong Co., Ltd. | Robot motion path planning method
CN106514667A (en) * | 2016-12-05 | 2017-03-22 | Beijing Institute of Technology | Human-computer cooperation system based on Kinect skeletal tracking and uncalibrated visual servoing
CN106846403A (en) * | 2017-01-04 | 2017-06-13 | Beijing Weidong Technology Co., Ltd. | Method, device and smart device for positioning a hand in three-dimensional space
CN106846403B (en) * | 2017-01-04 | 2020-03-27 | Beijing Weidong Technology Co., Ltd. | Method and device for positioning hand in three-dimensional space and intelligent equipment
CN106737685A (en) * | 2017-01-16 | 2017-05-31 | Shanghai Dajie Robot Technology Co., Ltd. | Manipulator motion system based on computer vision with real-time human-machine interaction
CN107253192A (en) * | 2017-05-24 | 2017-10-17 | Hubei Zhongyuhe Intelligent Equipment Technology Co., Ltd. | Uncalibrated human-computer interaction control system and method based on Kinect
CN107363831B (en) * | 2017-06-08 | 2020-01-10 | Institute of Automation, Chinese Academy of Sciences | Teleoperation robot control system and method based on vision
CN107363831A (en) * | 2017-06-08 | 2017-11-21 | Institute of Automation, Chinese Academy of Sciences | Vision-based teleoperation robot control system and method
CN108594657A (en) * | 2018-04-11 | 2018-09-28 | Fujian Deteng Intelligent Technology Co., Ltd. | Neural-network-based adaptive control method for a mechanical arm
CN108453742A (en) * | 2018-04-24 | 2018-08-28 | Nanjing University of Science and Technology | Kinect-based robot human-machine interaction system and method
CN108453742B (en) * | 2018-04-24 | 2021-06-08 | Nanjing University of Science and Technology | Kinect-based robot man-machine interaction system and method
CN109407709A (en) * | 2018-09-25 | 2019-03-01 | State Grid Tianjin Electric Power Company | Automatic conference camera tracking system based on Kinect skeleton tracking algorithm
CN109407709B (en) * | 2018-09-25 | 2022-01-18 | State Grid Tianjin Electric Power Company | Kinect skeleton tracking algorithm-based conference camera shooting automatic tracking system
CN109330494A (en) * | 2018-11-01 | 2019-02-15 | Gree Electric Appliances, Inc. of Zhuhai | Sweeping robot control method and system based on action recognition, and sweeping robot
CN109872594A (en) * | 2019-03-20 | 2019-06-11 | Second Affiliated Hospital of Xi'an Medical University | 3D nursing situation simulation digital learning system
CN109872594B (en) * | 2019-03-20 | 2021-06-22 | Second Affiliated Hospital of Xi'an Medical University | 3D nursing situation simulation digital learning system
CN109968310A (en) * | 2019-04-12 | 2019-07-05 | Chongqing Yubochuang Intelligent Equipment Research Institute Co., Ltd. | Mechanical arm interaction control method and system
CN112706158A (en) * | 2019-10-25 | 2021-04-27 | Shenyang Institute of Automation, Chinese Academy of Sciences | Industrial man-machine interaction system and method based on vision and inertial navigation positioning
CN111590560A (en) * | 2020-04-24 | 2020-08-28 | Guo Zirui | Method for remotely operating manipulator through camera
CN115129049A (en) * | 2022-06-17 | 2022-09-30 | Guangdong University of Technology | Mobile service robot path planning system and method with social awareness
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104570731A (en) | Uncalibrated human-computer interaction control system and method based on Kinect | |
US20210205986A1 (en) | Teleoperating Of Robots With Tasks By Mapping To Human Operator Pose | |
CN107253192A (en) | Uncalibrated human-computer interaction control system and method based on Kinect | |
CN110480634B (en) | Arm guide motion control method for mechanical arm motion control | |
Frank et al. | Realizing mixed-reality environments with tablets for intuitive human-robot collaboration for object manipulation tasks | |
Asfour et al. | Toward humanoid manipulation in human-centred environments | |
CN103112007B (en) | Man-machine interaction method based on hybrid sensor | |
CN108838991A (en) | Autonomous humanoid dual-arm robot and its tracking operating system for moving targets | |
CN105551059A (en) | Power transformation simulation human body motion capturing method based on optical and inertial body feeling data fusion | |
CN105252532A (en) | Method of cooperative flexible attitude control for motion capture robot | |
CN109968310A (en) | Mechanical arm interaction control method and system | |
WO2016193781A1 (en) | Motion control system for a direct drive robot through visual servoing | |
CN107877517A (en) | Motion mapping method based on CyberForce teleoperation mechanical arm | |
CN106371442B (en) | Mobile robot control method based on tensor product model transformation | |
CN110385694A (en) | Motion teaching device for a robot, robot system, and robot controller | |
Zhang et al. | A real-time upper-body robot imitation system | |
Skoglund et al. | Programming by demonstration of pick-and-place tasks for industrial manipulators using task primitives | |
Wang et al. | Joining force of human muscular task planning with robot robust and delicate manipulation for programming by demonstration | |
Lambrecht et al. | Markerless gesture-based motion control and programming of industrial robots | |
Gulde et al. | RoPose: CNN-based 2D pose estimation of industrial robots | |
Yin et al. | A systematic review on digital human models in assembly process planning | |
Kryuchkov et al. | Simulation of the «cosmonaut-robot» system interaction on the lunar surface based on methods of machine vision and computer graphics | |
Li et al. | Gesture recognition based on Kinect v2 and leap motion data fusion | |
Li et al. | A dexterous hand-arm teleoperation system based on hand pose estimation and active vision | |
CN109214295B (en) | Gesture recognition method based on data fusion of Kinect v2 and Leap Motion |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20150429 |