CN105945947A - Robot writing system based on gesture control and control method of robot writing system - Google Patents

Robot writing system based on gesture control and control method of robot writing system

Info

Publication number
CN105945947A
CN105945947A (application CN201610340833.1A)
Authority
CN
China
Prior art keywords
robot
kinect
computer
scara
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610340833.1A
Other languages
Chinese (zh)
Inventor
董秀成
唐勇
王超
杨邱滟
古世甫
郑海春
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xihua University
Original Assignee
Xihua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xihua University filed Critical Xihua University
Priority to CN201610340833.1A priority Critical patent/CN105945947A/en
Publication of CN105945947A publication Critical patent/CN105945947A/en
Pending legal-status Critical Current

Links

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40004Window function, only a specific region is analyzed
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40418Presurgical planning, on screen indicate regions to be operated on

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a robot control system and a control method thereof, in particular to a gesture-controlled robot writing system and a control method thereof. A computer obtains the operator's body pose information through a Kinect, communicates with a SCARA robot via the Modbus protocol over Ethernet, and obtains a live view of the SCARA robot's drawing site online through a camera. With this system, the operator can control the motion of the SCARA robot in real time in a non-contact, natural human-computer interaction manner, making the SCARA robot follow the operator's palm trajectory online so that the robot completes writing and drawing functions; when necessary, the robot can also be controlled to automatically wipe off the drawn trajectory.

Description

Gesture-controlled robot writing system and control method thereof
Technical field
The present invention relates to a robot control system and a control method thereof, specifically a gesture-controlled robot writing system and a control method thereof.
Background technology
With the development of robotics, robots have been widely used in industrial fields such as welding, assembly, handling, and spray painting. The writing function requires the end of the robot arm to move along a specified trajectory, and many applications need this capability. The motion control of an industrial robot can be realized by offline programming or online programming. In offline programming, the motion trajectory is prepared before the robot operates, and the robot then follows the specified trajectory during operation; in online programming, the robot draws along the current trajectory while the operator is formulating it.
At present, robot motion trajectories are mostly formulated by manually setting trajectory points. This programming mode is laborious and time-consuming, and the number of trajectory points is limited, so the resulting trajectories are relatively simple. In addition, some systems derive the writing trajectory from character outlines, but that approach only allows the robot to draw character trajectories of a specific font, and lacks personalization and customization.
Summary of the invention
In view of the above technical problems, the present invention provides a personalized and customizable robot writing system and a control method thereof.
The concrete technical scheme is as follows:
The gesture-controlled robot writing system includes a Kinect, a SCARA robot, a computer, a camera, and a router. The Kinect and the camera are each connected to the computer, the computer is connected to the router, and the computer includes a display screen. The computer obtains the operator's body pose information through the Kinect, communicates with the SCARA robot via the Modbus protocol over Ethernet, and uses the camera to obtain a live view of the SCARA robot's drawing site online.
The system also includes a display connected to the computer, on which the operator can observe their own image, the robot's current motion, the palm drawing trajectory, and the robot's drawing trajectory.
In the control method of the gesture-controlled robot writing system, after the program starts, the Kinect and the SCARA robot are first initialized: the Kinect color-image and skeleton extraction functions are enabled, a connection is established to the SCARA robot at the specified IP address, and the computer's display interface is initialized.
The program then enters its main loop. In each cycle it checks whether a mouse event has occurred (erasing, mouse writing, drawing from a trajectory file, or saving a trajectory). If so, the corresponding subprocess is executed; otherwise the Kinect is polled for the latest body pose.
From the information obtained from the Kinect, the program judges whether a user is currently attempting gesture manipulation; if so, stroke parsing, trajectory planning, and interface recording are performed.
When the robot's motion needs to be controlled, the program calls the SCARA robot control submodule to complete the corresponding action.
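The per-cycle dispatch described above can be sketched as follows. This is a minimal illustration, not code from the patent; all function and key names (`run_cycle`, `poll_pose`, the event strings) are assumptions.

```python
def run_cycle(mouse_event, handlers, poll_pose, gesture_pipeline):
    """One iteration of the main loop: mouse events take priority; otherwise
    the latest pose is polled and, if a gesture is in progress, handed to the
    gesture pipeline (stroke parsing, trajectory planning, interface recording)."""
    if mouse_event is not None:
        handlers[mouse_event]()          # erase / mouse write / draw file / save
        return "mouse"
    pose = poll_pose()                   # latest body pose from Kinect, or None
    if pose is not None and pose.get("gesture"):
        gesture_pipeline(pose)
        return "gesture"
    return "idle"
```

A surrounding `while True:` loop would call `run_cycle` repeatedly, invoking the robot control submodule from within the gesture pipeline when motion is required.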
The Kinect is a three-dimensional somatosensory sensor released by Microsoft. It can obtain environmental information such as images, depth, and audio, and provides functions such as motion capture, image recognition, and speech recognition, allowing users to interact with machines through body movement and voice, free of the constraints of conventional contact-based interaction. The Kinect perceives three-dimensional image information of the environment through three "eyes": the color camera captures color images of the environment; the infrared emitter projects harmless infrared laser light into the environment in a specific pattern; and the infrared camera captures the infrared image of the environment, from which depth information is obtained. Based on the depth information, the Kinect can accurately obtain the spatial positions of the main joints of the human body, and thus the body's pose information.
The Kinect uses the depth image to obtain skeleton information. The extraction process has four main steps: human target detection and segmentation, body-part recognition, centroid computation, and skeleton model fitting. First, the human target in the field of view is detected and segmented from the background, yielding a depth image that contains only the human body; the Kinect realizes this with a random-forest method. Second, for each pixel of the human depth image, the probability that it belongs to each body part (head, neck, hands, and so on) is computed; each pixel is then classified, identifying the body parts in the depth image, with a method similar to that of the first step. Third, the centroid of each body part is computed, giving the three-dimensional coordinates of each part. Finally, the part coordinates are smoothed and otherwise post-processed, and matched against a skeleton model represented by joint coordinates, yielding the skeleton information expressed in joint coordinates.
Methods that obtain human pose from color images alone can only recover two-dimensional positions and are easily affected by illumination, background, and clothing color, so their accuracy and robustness are poor. Since the depth image reflects the spatial position of every pixel in the camera's field of view, it yields the three-dimensional coordinates of the body and of multiple major joints, unaffected by factors such as ambient lighting and clothing color.
The gesture-controlled robot writing system provided by the present invention uses Microsoft's Kinect somatosensory sensor to obtain human pose information in a non-contact manner. The positions of the two shoulders are used to construct the palm drawing area, so the system adapts to operators of different body sizes; the relative position of the palm within this area controls the robot's drawing trajectory, and the open or closed state of the palm controls the robot's pen-down and pen-up actions. The palm drawing trajectory is processed according to a specified error tolerance, which improves the control efficiency of the robot. The system enables the operator to control the motion of the SCARA robot in real time through non-contact, natural human-computer interaction: the robot follows the operator's palm trajectory online, completing writing and drawing functions, and when necessary the robot can also be commanded to automatically erase the drawn trajectory.
Brief description of the drawings
Fig. 1 is a schematic diagram of the system structure of the present invention;
Fig. 2 is a schematic diagram of the whiteboard installation of the SCARA robot of the embodiment;
Fig. 3 is the layout of the system of the embodiment;
Fig. 4 is a schematic diagram of the palm drawing area of the embodiment;
Fig. 5 is a schematic diagram of the trajectory planning method of the embodiment;
Fig. 6 is the control flow chart of the present invention.
Detailed description of the invention
The detailed implementation of the present invention is described below in conjunction with the embodiments.
The controlled device of this embodiment is a SCARA robot, an industrial robot of cylindrical-coordinate type with four degrees of freedom, realized by three revolute joints and one prismatic joint. Because the axes of the three revolute joints are parallel to the Z axis, and the prismatic joint moves the end-effector linearly along the Z axis, the SCARA robot's end-effector can translate along the X, Y, and Z axes and rotate about the Z axis. The rotation ranges of the first two joints are ±130° and ±140° respectively, the two arm lengths are both 250 mm, the stroke along the Z axis is 200 mm, and the range of the final revolute joint is ±360°. The workspace of the SCARA robot is bounded by several circular arcs. The SCARA robot uses a body controller with a PLC at its core. The body controller performs real-time control and feedback of the robot joint variables, and can detect faults and raise warnings. It can also communicate with a host computer over Ethernet, so that the computer can manage and control the robot.
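As an illustrative aside, not given in the patent: the standard forward kinematics of a 4-DOF SCARA arm with the 250 mm link lengths stated above can be sketched as follows (function name and angle conventions are assumptions).

```python
import math

L1 = L2 = 0.25  # link lengths in metres (250 mm each, from the embodiment)

def scara_fk(theta1, theta2, z, theta4):
    """Forward kinematics of a 4-DOF SCARA arm (angles in radians).

    The two revolute shoulder/elbow joints act in the X-Y plane, the
    prismatic joint sets Z directly, and the wrist joint adds to the
    end-effector orientation about Z.
    """
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    phi = theta1 + theta2 + theta4  # orientation about the Z axis
    return x, y, z, phi
```

With both joints at zero the arm is fully extended, reaching 0.5 m along X, which matches the arc-bounded workspace described above.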
Since the Kinect can obtain a color image and depth information of the environment simultaneously, it can be used to obtain human pose information and palm state relatively easily.
As shown in Fig. 1, the gesture-controlled robot writing system includes a Kinect, a SCARA robot, a computer, a camera, and a router. The Kinect and the camera are each connected to the computer, the computer is connected to the router, and the computer includes a display screen. The computer obtains the operator's body pose information through the Kinect, communicates with the SCARA robot via the Modbus protocol over Ethernet, and uses the camera to obtain a live view of the SCARA robot's drawing site online. The camera is a USB camera.
The system also includes a display connected to the computer, on which the operator observes their own image, the robot's current motion, the palm drawing trajectory, and the robot's drawing trajectory. The display is an LCD television.
In the system, the SCARA robot's drawing plane is a whiteboard and the drawing tool is a whiteboard marker. As shown in Fig. 2, the whiteboard marker is mounted directly below the Z axis of the SCARA robot, and adjacent to it is the device that realizes trajectory erasing. Erasing is pneumatically powered: a pneumatic solenoid valve controls the raising and lowering of the whiteboard eraser, so the program can switch between drawing mode and erasing mode.
The layout of the system is shown in Fig. 3. The Kinect is placed directly above the LCD television; the operator stands at a distance of 1.5 m to 2.1 m from the Kinect; the whiteboard is directly below the SCARA robot's end-effector; and the USB camera that monitors the SCARA robot's drawing is mounted obliquely above the whiteboard.
An effective palm drawing area is determined from the two shoulder positions in the Kinect pose data, and the palm drawing area is linearly mapped onto the robot's drawing area, so that the relative position of the palm corresponds to the position of the SCARA robot's pen within the drawing area. The palm drawing area is shown schematically in Fig. 4, where N is the neck position, Q and S are the two shoulder positions, and P is the palm position; S and P are on the active arm. The origin of the palm drawing area is defined at shoulder S, and the size of the area is determined by the shoulder width, i.e. the distance between Q and S, so that the system adapts to people of different builds.
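The shoulder-anchored linear mapping described above might be sketched as follows. This is a hypothetical reconstruction: the patent does not give the scale factor between shoulder width and area size, so the square-area assumption and the factor `k` are guesses, and all names are illustrative.

```python
import math

def palm_to_pen(palm, shoulder_s, shoulder_q, board_w, board_h, k=2.0):
    """Map a palm position onto pen coordinates on the drawing board.

    palm, shoulder_s, shoulder_q: (x, y) skeleton positions.
    The drawing area is taken as a square anchored at shoulder S whose
    side is k times the shoulder width |QS| (k is an assumption).
    """
    sx, sy = shoulder_s
    qx, qy = shoulder_q
    side = k * math.hypot(qx - sx, qy - sy)   # side length scales with build
    # Normalised palm position inside the area, clamped to [0, 1]
    u = min(max((palm[0] - sx) / side, 0.0), 1.0)
    v = min(max((sy - palm[1]) / side, 0.0), 1.0)  # image y grows downward
    # Linear mapping onto the robot's board
    return u * board_w, v * board_h
```

Because the area scales with |QS|, operators with different shoulder widths sweep the same normalized range, which is the adaptation property claimed above.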
When controlling the SCARA robot to write, the program sends the target position represented by a trajectory point to the SCARA robot, and only when the robot reaches that point does it send the command to move to the next one. From sending a command to completing the motion, each trajectory point costs a certain amount of time, generally longer than the period at which the Kinect gesture produces trajectory points. With this point-by-point control mode, a considerable delay would build up between the motion of the SCARA robot and the motion of the operator's palm. This embodiment therefore adopts a trajectory planning method that keeps the robot's trajectory similar to the palm trajectory without incurring a large delay. The method is shown in Fig. 5, where the gray and black dots together form the original stroke trajectory, and the black dots are the trajectory points used to control the SCARA robot after planning. In the figure, e is the configured maximum error, the maximum distance between the original trajectory and the planned trajectory. In the planning algorithm, given the most recently planned point P, the next output point Q is searched for such that the distance from every original trajectory point between P and Q to the line segment PQ is less than e. Compared with the original trajectory, the number of points after processing is much smaller, which greatly shortens the SCARA robot's motion control time and improves its following speed. By setting e appropriately, the trade-off between efficiency and precision of SCARA robot motion control is balanced.
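The greedy point-thinning just described (from the last kept point P, advance Q as far as possible while every intermediate point stays within e of segment PQ) can be sketched as below; function names are illustrative, not from the patent.

```python
import math

def point_seg_dist(p, a, b):
    """Distance from point p to line segment ab (all 2-D tuples)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def simplify(points, e):
    """Thin a stroke trajectory: keep the first point, then repeatedly
    extend segment PQ as far as the tolerance e allows."""
    if len(points) <= 2:
        return list(points)
    out = [points[0]]
    i = 0
    while i < len(points) - 1:
        j = i + 1
        # extend the candidate endpoint while all intermediate points fit
        while j + 1 < len(points) and all(
            point_seg_dist(points[k], points[i], points[j + 1]) < e
            for k in range(i + 1, j + 1)
        ):
            j += 1
        out.append(points[j])
        i = j
    return out
```

On a straight stroke the output collapses to the two endpoints, while a corner sharper than e is always retained, which is the efficiency/precision balance governed by e.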
The main function to be realized by the robot writing system of the present invention is to let the user control the actions of the SCARA robot through gestures, so that it imitates human writing. Besides the gesture interaction mode, the user can also control the SCARA robot's writing actions with the mouse or with a trajectory file. The system additionally provides functions such as trajectory erasing, trajectory file saving, whiteboard monitoring, template character setting, interface video recording, and recording playback.
In the control method of the gesture-controlled robot writing system, as shown in Fig. 6, after the program starts, the Kinect and the SCARA robot are first initialized: the Kinect color-image and skeleton extraction functions are enabled, a connection is established to the SCARA robot at the specified IP address, and the computer's display interface is initialized.
The program then enters its main loop. In each cycle it checks whether a mouse event has occurred (erasing, mouse writing, drawing from a trajectory file, or saving a trajectory). If so, the corresponding subprocess is executed; otherwise the Kinect is polled for the latest body pose.
From the information obtained from the Kinect, the program judges whether a user is currently attempting gesture manipulation; if so, stroke parsing, trajectory planning, and interface recording are performed.
When the robot's motion needs to be controlled, the program calls the SCARA robot control submodule to complete the corresponding action.
Functions such as SCARA robot joint degree-of-freedom control, position feedback, and pneumatic solenoid valve IO control are all realized by the robot body controller, and the computer communicates with the SCARA robot body controller via the Modbus protocol established over Ethernet. Using the Modbus protocol, the program running on the computer can control the robot's actions and manage its state. In this embodiment, the computer software wraps the Modbus communication commands for SCARA end-effector position control, position detection, and solenoid valve IO control, for the individual functional modules to call.
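The patent does not disclose the robot's register map, so purely as an illustration of how such a wrapper might pack a target pose into Modbus holding-register values (which a client library such as pymodbus could then send with `ModbusTcpClient.write_registers`), one could imagine an encoding like this; the scaling, word order, and flag register are all assumptions.

```python
def pack_target(x_mm, y_mm, z_mm, pen_down):
    """Encode a target pose as 16-bit Modbus holding-register values.

    Hypothetical layout: X, Y, Z in hundredths of a millimetre, each as a
    32-bit two's-complement value split high word first across two
    registers, followed by one pen/eraser flag register.
    """
    regs = []
    for v in (x_mm, y_mm, z_mm):
        raw = int(round(v * 100)) & 0xFFFFFFFF   # 32-bit two's complement
        regs += [(raw >> 16) & 0xFFFF, raw & 0xFFFF]
    regs.append(1 if pen_down else 0)
    return regs
```

Modbus registers are unsigned 16-bit quantities, which is why signed millimetre values must be scaled and split explicitly before being written to the body controller.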

Claims (3)

1. A gesture-controlled robot writing system, characterized in that it comprises a Kinect, a SCARA robot, a computer, a camera, and a router, wherein the Kinect and the camera are each connected to the computer, the computer is connected to the router, and the computer includes a display screen; the computer obtains the operator's body pose information through the Kinect, communicates with the SCARA robot via the Modbus protocol over Ethernet, and uses the camera to obtain a live view of the SCARA robot's drawing site online.
2. The gesture-controlled robot writing system according to claim 1, characterized in that it further comprises a display connected to the computer, on which the operator observes their own image, the robot's current motion, the palm drawing trajectory, and the robot's drawing trajectory.
3. A control method of the gesture-controlled robot writing system according to claim 1 or 2, characterized in that it comprises the following procedure: after the program starts, the Kinect and the SCARA robot are first initialized; the Kinect color-image and skeleton extraction functions are enabled, a connection is established to the SCARA robot at the specified IP address, and the computer's display interface is initialized;
then the program enters its main loop, in which each cycle checks whether a mouse event has occurred (erasing, mouse writing, drawing from a trajectory file, or saving a trajectory); if so, the corresponding subprocess is executed, otherwise the Kinect is polled for the latest body pose;
from the information obtained from the Kinect, the program judges whether a user is currently attempting gesture manipulation, and if so, stroke parsing, trajectory planning, and interface recording are performed;
when the robot's motion needs to be controlled, the program calls the SCARA robot control submodule to complete the corresponding action.
CN201610340833.1A 2016-05-20 2016-05-20 Robot writing system based on gesture control and control method of robot writing system Pending CN105945947A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610340833.1A CN105945947A (en) 2016-05-20 2016-05-20 Robot writing system based on gesture control and control method of robot writing system


Publications (1)

Publication Number Publication Date
CN105945947A true CN105945947A (en) 2016-09-21

Family

ID=56910158

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610340833.1A Pending CN105945947A (en) 2016-05-20 2016-05-20 Robot writing system based on gesture control and control method of robot writing system

Country Status (1)

Country Link
CN (1) CN105945947A (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106485765A (en) * 2016-10-13 2017-03-08 中国科学院半导体研究所 A kind of method of automatic description face stick figure
CN106514667A (en) * 2016-12-05 2017-03-22 北京理工大学 Human-computer cooperation system based on Kinect skeletal tracking and uncalibrated visual servo
CN106570920A (en) * 2016-11-02 2017-04-19 邹操 Art display system based on plane scanning technology and method thereof
CN106683148A (en) * 2016-12-08 2017-05-17 邹操 Fractal graph generating and displaying system and method
CN107930161A (en) * 2017-11-16 2018-04-20 大连交通大学 A kind of shadow puppet performance equipment
CN108460369A (en) * 2018-04-04 2018-08-28 南京阿凡达机器人科技有限公司 A kind of drawing practice and system based on machine vision
CN108564063A (en) * 2018-04-27 2018-09-21 北京华捷艾米科技有限公司 Centre of the palm localization method based on depth information and system
CN108568822A (en) * 2018-06-27 2018-09-25 西华大学 Heterogeneous remote control system based on multiple robots
CN108961414A (en) * 2017-05-19 2018-12-07 中兴通讯股份有限公司 A kind of display control method and device
CN113246131A (en) * 2021-05-27 2021-08-13 广东智源机器人科技有限公司 Motion capture method and device, electronic equipment and mechanical arm control system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103226387A (en) * 2013-04-07 2013-07-31 华南理工大学 Video fingertip positioning method based on Kinect
CN104440926A (en) * 2014-12-09 2015-03-25 重庆邮电大学 Mechanical arm somatic sense remote controlling method and mechanical arm somatic sense remote controlling system based on Kinect
CN104777775A (en) * 2015-03-25 2015-07-15 北京工业大学 Two-wheeled self-balancing robot control method based on Kinect device
KR20150097049A (en) * 2014-02-17 2015-08-26 경북대학교 산학협력단 self-serving robot system using of natural UI
CN105252532A (en) * 2015-11-24 2016-01-20 山东大学 Method of cooperative flexible attitude control for motion capture robot
CN105291138A (en) * 2015-11-26 2016-02-03 华南理工大学 Visual feedback platform improving virtual reality immersion degree


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
C. Pillajo, J. E. Sierra: "Human Machine Interface HMI using Kinect sensor to control a SCARA Robot", Communication and Computing *
Jing Xingbi, Wan Renming: "Soft-pen calligraphy robot", Robot Technology and Application *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106485765B (en) * 2016-10-13 2019-09-03 中国科学院半导体研究所 A kind of method of automatic description face stick figure
CN106485765A (en) * 2016-10-13 2017-03-08 中国科学院半导体研究所 A kind of method of automatic description face stick figure
CN106570920A (en) * 2016-11-02 2017-04-19 邹操 Art display system based on plane scanning technology and method thereof
CN106514667A (en) * 2016-12-05 2017-03-22 北京理工大学 Human-computer cooperation system based on Kinect skeletal tracking and uncalibrated visual servo
CN106683148A (en) * 2016-12-08 2017-05-17 邹操 Fractal graph generating and displaying system and method
CN108961414A (en) * 2017-05-19 2018-12-07 中兴通讯股份有限公司 A kind of display control method and device
CN107930161B (en) * 2017-11-16 2019-08-13 大连交通大学 A kind of shadow puppet performance equipment
CN107930161A (en) * 2017-11-16 2018-04-20 大连交通大学 A kind of shadow puppet performance equipment
CN108460369A (en) * 2018-04-04 2018-08-28 南京阿凡达机器人科技有限公司 A kind of drawing practice and system based on machine vision
WO2019192149A1 (en) * 2018-04-04 2019-10-10 南京阿凡达机器人科技有限公司 Machine-vision-based drawing method and system
CN108564063A (en) * 2018-04-27 2018-09-21 北京华捷艾米科技有限公司 Centre of the palm localization method based on depth information and system
CN108568822A (en) * 2018-06-27 2018-09-25 西华大学 Heterogeneous remote control system based on multiple robots
CN113246131A (en) * 2021-05-27 2021-08-13 广东智源机器人科技有限公司 Motion capture method and device, electronic equipment and mechanical arm control system

Similar Documents

Publication Publication Date Title
CN105945947A (en) Robot writing system based on gesture control and control method of robot writing system
AU2020201554B2 (en) System and method for robot teaching based on RGB-D images and teach pendant
CN110405730B (en) Human-computer interaction mechanical arm teaching system based on RGB-D image
CN102638653B (en) Automatic face tracing method on basis of Kinect
US7353082B2 (en) Method and a system for programming an industrial robot
CN108858195A (en) A kind of Triple distribution control system of biped robot
JP2008530661A (en) System and method for a gesture-based control system
CN110142770B (en) Robot teaching system and method based on head-mounted display device
CN106313049A (en) Somatosensory control system and control method for apery mechanical arm
CN103413487B (en) A kind of technology for assembling transformers analogue system
CN104457566A (en) Spatial positioning method not needing teaching robot system
CN106041928A (en) Robot job task generation method based on workpiece model
CN108202316A (en) A kind of crusing robot and control method of automatic switch cabinet door
CN107214700A (en) A kind of robot autonomous patrol method
Taylor et al. Visual perception and robotic manipulation: 3D object recognition, tracking and hand-eye coordination
CN109199240A (en) A kind of sweeping robot control method and system based on gesture control
CN115469576B (en) Teleoperation system based on human-mechanical arm heterogeneous motion space hybrid mapping
CN102830798A (en) Mark-free hand tracking method of single-arm robot based on Kinect
CN106468917A (en) A kind of tangible live real-time video image remotely assume exchange method and system
CN113103230A (en) Human-computer interaction system and method based on remote operation of treatment robot
JP7261306B2 (en) Information processing device, setting device, image recognition system, robot system, setting method, learning device, and learned model generation method
US20200398420A1 (en) Robot teaching device and robot system
CN113211447A (en) Mechanical arm real-time perception planning method and system based on bidirectional RRT algorithm
CN111113414B (en) Robot three-dimensional space scale prompting method and system based on screen identification
Liang et al. Bare-hand depth perception used in augmented reality assembly supporting

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20160921