CN102955563A - Robot control system and method - Google Patents

Robot control system and method

Info

Publication number
CN102955563A
CN102955563A · Application CN201110245701A
Authority
CN
China
Prior art keywords
effector
robot
movable position
image
dimensional image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN 201110245701
Other languages
Chinese (zh)
Inventor
李后贤
李章荣
罗治平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hongfujin Precision Industry Shenzhen Co Ltd
Hon Hai Precision Industry Co Ltd
Original Assignee
Hongfujin Precision Industry Shenzhen Co Ltd
Hon Hai Precision Industry Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hongfujin Precision Industry Shenzhen Co Ltd, Hon Hai Precision Industry Co Ltd filed Critical Hongfujin Precision Industry Shenzhen Co Ltd
Priority to CN 201110245701 priority Critical patent/CN102955563A/en
Publication of CN102955563A publication Critical patent/CN102955563A/en
Pending legal-status Critical Current

Landscapes

  • Manipulator (AREA)

Abstract

A robot control system comprises an image capturing module, a relation building module, an image analysis module, and a control module. The image capturing module controls a time-of-flight camera to continuously photograph an operator and obtain real-time three-dimensional images of the operator. The relation building module identifies, according to the N movable parts of a robot, N movable parts of the operator in the three-dimensional image, and establishes a one-to-one correspondence between the operator's N movable parts and the robot's N movable parts. The image analysis module analyzes the operator's real-time three-dimensional images to obtain movement data for each movable part of the operator. According to this movement data, the control module sends instructions to the robot over a network, directing each movable part of the robot to perform the corresponding action. The invention further provides a robot control method.

Description

Robot control system and method
Technical field
The present invention relates to a robot control system and method.
Background technology
With the development of science and technology, robots that can replace human labor are used in many fields. In manufacturing, for example, many industrial robots work on production lines. Robots are also common in other applications such as construction, oil drilling, mining, undersea exploration, hazardous-material cleanup, search and rescue, medicine, and the military. At present, an operator typically controls a robot's actions either through dedicated control equipment or by wearing a special sensing device. This approach requires the operator to be deeply familiar with the dedicated equipment, and is comparatively cumbersome and unintuitive.
Summary of the invention
In view of the above, it is necessary to provide a robot control system applied in a master control device, where the master control device is connected to a depth camera and communicates with a robot over a network. The system comprises: an image capturing module, which controls the depth camera to continuously photograph the operator and obtain real-time three-dimensional images of the operator; a relation building module, which, according to the N movable parts of the robot, identifies N movable parts of the operator in a three-dimensional image of the operator and establishes a one-to-one correspondence between the operator's N movable parts and the robot's N movable parts; an image analysis module, which analyzes the operator's real-time three-dimensional images to obtain movement data for each calibrated movable part of the operator, the movement data comprising the moving direction of each movable part and the displacement along that direction; and a control module, which, according to the movement data of each movable part of the operator, sends control instructions to the robot over the network so that each movable part of the robot performs the same action as the operator's corresponding movable part.
It is also necessary to provide a robot control method applied in a master control device, where the master control device is connected to a depth camera and communicates with a robot over a network. The method comprises: an image capturing step, in which the depth camera is controlled to continuously photograph the operator to obtain real-time three-dimensional images of the operator; a relation building step, in which, according to the N movable parts of the robot, N movable parts of the operator are identified in a three-dimensional image of the operator and a one-to-one correspondence is established between the operator's N movable parts and the robot's N movable parts; an image analysis step, in which the operator's real-time three-dimensional images are analyzed to obtain movement data for each calibrated movable part of the operator, comprising each part's moving direction and displacement along that direction; and a control step, in which, according to this movement data, control instructions are sent to the robot over the network so that each movable part of the robot performs the same action as the operator's corresponding movable part.
Compared with the prior art, the robot control system and method of the present invention allow an operator to control a robot in a more intuitive and convenient way.
Description of drawings
Fig. 1 is a schematic diagram of the operating environment of a preferred embodiment of the robot control system of the present invention.
Fig. 2 is an architecture diagram of a preferred embodiment of the master control device in Fig. 1.
Fig. 3 is a schematic diagram of a three-dimensional image of a human body.
Fig. 4 is a schematic diagram of establishing a correspondence between the operator's movable parts and the robot's movable parts in a preferred embodiment of the present invention.
Fig. 5 is a flowchart of a preferred embodiment of the robot control method of the present invention.
Description of main element symbols
Master control device 1
Depth camera 2
Network 3
Robot control system 10
Memory 11
Processor 12
Image capturing module 101
Relation building module 102
Image analysis module 103
Control module 104
Operator M0
Robot M1
The present invention is further described in the following embodiments with reference to the above drawings.
Embodiment
Fig. 1 is a schematic diagram of the operating environment of a preferred embodiment of the robot control system of the present invention. The robot control system 10 is installed in a master control device 1. The master control device 1 is connected to a depth camera 2 and communicates in real time with a robot M1 over a network 3. The robot control system 10 obtains real-time three-dimensional images of an operator M0 through the depth camera 2 and analyzes these images to obtain the movement data of the operator M0. Based on this movement data, the robot control system 10 sends control instructions to the robot M1 over the network 3, so that the robot M1 performs the same actions as the operator M0. The network 3 may be a wired or a wireless network.
It should be noted that the robot M1 may work within the field of vision of the operator M0, so that the operator M0 can act appropriately according to the environment at the robot M1's current location and thereby control the robot M1 accurately. When the robot M1 is at a remote location outside the operator M0's field of vision, the operator M0 can obtain video images of the robot M1's surroundings through other auxiliary equipment and act according to those images to control the robot M1. Furthermore, if the robot M1 has vision capability, it can actively capture such video images itself and return them to the master control device 1 over the network 3 for the operator M0's reference.
The depth camera 2 is a camera device that uses depth-sensing technology: it actively illuminates the scene with emitted light beams, and computes the distance between the camera and each object in the scene from the time difference or phase difference of the reflected beams, thereby obtaining a three-dimensional image containing depth-of-field information. The depth camera 2 may be a time-of-flight (TOF) camera.
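As an illustrative sketch (not part of the patent), the ranging principle of such a TOF camera follows directly from the round-trip travel time of the emitted light; the function name and the example timing value below are ours:

```python
# Hypothetical illustration of the time-of-flight ranging principle the
# depth camera relies on; names and numbers are illustrative, not the patent's.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance from camera to object given the measured round-trip time."""
    # The light travels to the object and back, so halve the path length.
    return C * round_trip_seconds / 2.0

# A 20 ns round trip corresponds to roughly 3 m of camera-to-object distance.
d = tof_distance(20e-9)
```

The phase-difference variant mentioned in the text recovers the same quantity from the phase shift of a modulated beam rather than from a direct timing measurement.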
Fig. 2 is an architecture diagram of the master control device 1. The master control device 1 comprises a memory 11, a processor 12, and the robot control system 10. The robot control system 10 comprises an image capturing module 101, a relation building module 102, an image analysis module 103, and a control module 104. The robot control system 10 may be stored in the memory 11, and its execution is controlled by the processor 12. In the present embodiment, the master control device 1 may be a computer, a server, or any other control device capable of three-dimensional image processing and data transmission.
The image capturing module 101 controls the depth camera 2 to continuously photograph the operator M0 and obtain real-time three-dimensional images of the operator M0. Each three-dimensional image contains the distance between each part of the operator M0's body and the depth camera 2. Fig. 3, for example, is a schematic diagram of a three-dimensional human-body image captured by the depth camera 2: it contains the body image in the XY plane, together with the distance between each body part and the depth camera 2 along the Z direction.
The relation building module 102, according to the N movable parts of the robot M1 (such as S0', S1', S2', S3', S4', S5', and S6' in Fig. 4), identifies N movable parts of the operator M0 in a three-dimensional image of the operator M0 (such as S0, S1, S2, S3, S4, S5, and S6 in Fig. 4), and then establishes a one-to-one correspondence between the operator M0's N movable parts and the robot M1's N movable parts, e.g. S0' corresponds to S0, S1' to S1, and S6' to S6. Once this one-to-one correspondence is established, in the subsequent control process each movable part of the robot M1 performs the action corresponding to the action of the operator M0's matching movable part. In addition, the relation building module 102 extracts from the three-dimensional image a characteristic image of each calibrated movable part of the operator M0 and stores the extracted characteristic images in the memory 11.
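The one-to-one correspondence described above amounts to a simple lookup table. The part labels S0–S6 and S0'–S6' follow Fig. 4; everything else in this sketch is an illustrative assumption:

```python
# One-to-one mapping from calibrated operator parts to robot parts (Fig. 4).
OPERATOR_PARTS = ["S0", "S1", "S2", "S3", "S4", "S5", "S6"]
ROBOT_PARTS = ["S0'", "S1'", "S2'", "S3'", "S4'", "S5'", "S6'"]

# Build the correspondence: operator part -> robot part.
part_map = dict(zip(OPERATOR_PARTS, ROBOT_PARTS))

def robot_part_for(operator_part: str) -> str:
    """Which robot part should mirror a given operator part."""
    return part_map[operator_part]
```

During control, each instruction for an operator part is routed to the robot part this table returns.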
The image analysis module 103 analyzes the captured real-time three-dimensional images of the operator M0 to obtain movement data for each calibrated movable part of the operator M0. This movement data includes the moving direction of each movable part and the displacement along that direction.
Specifically, in the present embodiment, the image analysis module 103 may compare the current-frame and previous-frame three-dimensional images of the operator M0: by comparing the position information (for example, the three-dimensional coordinates) of each movable part's characteristic image in the two frames, it calculates the moving direction of each movable part and the displacement along that direction. For example, by comparing the position of movable part S1 along the Z axis of the coordinate system XYZ shown in Fig. 3 between the current and previous frames, it can calculate how far S1 has advanced or retreated; and by comparing S1's position along the X and Y axes, it can calculate how far S1 has moved up, down, left, or right.
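A minimal sketch of this frame-to-frame comparison, under the assumption (ours, not the patent's) that each frame stores one 3-D coordinate per calibrated part:

```python
import math

# Hypothetical sketch of the analysis module's frame differencing.
def movement_data(prev_frame: dict, curr_frame: dict) -> dict:
    """Per-part movement vector and distance between two consecutive frames.

    Frames map part name -> (x, y, z); Z is the depth axis of Fig. 3.
    """
    moves = {}
    for part, (x1, y1, z1) in curr_frame.items():
        x0, y0, z0 = prev_frame[part]
        dx, dy, dz = x1 - x0, y1 - y0, z1 - z0
        moves[part] = {
            "direction": (dx, dy, dz),  # signed per-axis motion
            "distance": math.sqrt(dx * dx + dy * dy + dz * dz),
        }
    return moves

prev = {"S1": (0.0, 0.0, 2.0)}
curr = {"S1": (0.1, 0.0, 1.8)}  # S1 moved right and toward the camera
m = movement_data(prev, curr)
```

A negative Z component corresponds to the part advancing toward the camera, matching the advance/retreat example in the text.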
Alternatively, the image analysis module 103 may interface with middleware analysis software (for example, OpenNI), feeding the captured real-time three-dimensional images into the middleware, which analyzes them and directly returns the movement data of each movable part of the operator M0. OpenNI is short for Open Natural Interaction; it provides a standard interface for analyzing and interpreting a user's natural interactions, such as voice, gestures, and body movements.
The control module 104, according to the movement data of each movable part of the operator M0, sends control instructions to the robot M1 over the network 3 so that each movable part of the robot M1 performs the corresponding action. In the present embodiment, a control instruction contains the movement data of each movable part of the operator M0. After receiving the instruction, the robot M1 drives each of its movable parts to perform the corresponding action through its built-in actuation system.
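The patent only states that the instruction carries each part's movement data and travels over the network 3; the concrete wire format below (JSON, and the field names `type` and `parts`) is purely our assumption for illustration:

```python
import json

# Hypothetical wire format for a control instruction; the patent does not
# specify an encoding, so JSON over the network is an illustrative choice.
def build_instruction(moves: dict) -> bytes:
    """Encode per-part movement data as one network message."""
    payload = {"type": "move", "parts": moves}
    return json.dumps(payload).encode("utf-8")

def parse_instruction(message: bytes) -> dict:
    """What the robot side would decode before driving its actuators."""
    return json.loads(message.decode("utf-8"))

msg = build_instruction(
    {"S1'": {"direction": [0.1, 0.0, -0.2], "distance": 0.22}}
)
```

The robot's actuation system would then map each decoded entry onto the matching movable part.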
As shown in Fig. 5, a preferred embodiment of the robot control method of the present invention proceeds as follows.
In step S01, the image capturing module 101 controls the depth camera 2 to continuously photograph the operator M0 and obtain real-time three-dimensional images of the operator M0. Each three-dimensional image contains the distance between each part of the operator M0's body and the depth camera 2.
In step S02, the relation building module 102, according to the N movable parts of the robot M1 (such as S0', S1', S2', S3', S4', S5', and S6' in Fig. 4), identifies N movable parts of the operator M0 in a three-dimensional image of the operator M0 (such as S0, S1, S2, S3, S4, S5, and S6 in Fig. 4), and establishes a one-to-one correspondence between the operator M0's N movable parts and the robot M1's N movable parts. Further, the relation building module 102 extracts from the three-dimensional image a characteristic image of each calibrated movable part of the operator M0 and stores the extracted characteristic images in the memory 11.
In step S03, the image analysis module 103 analyzes the real-time three-dimensional images of the operator M0 to obtain movement data for each calibrated movable part of the operator M0, including each part's moving direction and the displacement along that direction.
In step S04, the control module 104, according to the movement data of each movable part of the operator M0, sends control instructions to the robot M1 over the network 3 so that each movable part of the robot M1 performs the corresponding action. In the present embodiment, a control instruction contains the movement data of each movable part of the operator M0. After receiving the instruction, the robot M1 drives each of its movable parts through its built-in actuation system to perform the same actions as the operator M0.
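Steps S01 through S04 together form one capture-analyze-send loop. In this illustrative sketch the camera feed, part mapping, and sender stand in for modules 101, 102/103, and 104; all names are ours:

```python
# Illustrative control loop tying steps S01-S04 together.
def control_loop(frames, part_map, send):
    """frames: iterable of {part: (x, y, z)} dicts from the depth camera.

    part_map: operator part -> robot part (step S02's correspondence).
    send: callable delivering one instruction to the robot (step S04).
    """
    prev = None
    for frame in frames:  # S01: continuous capture
        if prev is not None:
            moves = {}  # S03: frame-to-frame movement analysis
            for part, (x1, y1, z1) in frame.items():
                x0, y0, z0 = prev[part]
                moves[part_map[part]] = (x1 - x0, y1 - y0, z1 - z0)
            send(moves)  # S04: instruction to the robot over the network
        prev = frame

sent = []
control_loop(
    frames=[{"S1": (0.0, 0.0, 2.0)}, {"S1": (0.0, 0.0, 1.5)}],
    part_map={"S1": "S1'"},
    send=sent.append,
)
```

With two frames, one instruction is emitted: part S1 moved 0.5 along -Z (toward the camera), so the robot part S1' receives that movement vector.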
The above embodiments are intended only to illustrate, not to limit, the technical solution of the present invention. Although the present invention has been described in detail with reference to preferred embodiments, those of ordinary skill in the art should understand that the technical solution of the present invention may be modified or equivalently substituted without departing from its spirit and scope.

Claims (10)

1. A robot control method applied in a master control device, the master control device being connected to a depth camera and communicating with a robot over a network, the method comprising:
an image capturing step: controlling the depth camera to continuously photograph an operator to obtain real-time three-dimensional images of the operator;
a relation building step: according to the N movable parts of the robot, identifying N movable parts of the operator in a three-dimensional image of the operator, and establishing a one-to-one correspondence between the operator's N movable parts and the robot's N movable parts;
an image analysis step: analyzing the operator's real-time three-dimensional images to obtain movement data for each calibrated movable part of the operator, the movement data comprising each movable part's moving direction and displacement along that direction; and
a control step: according to the movement data of each movable part of the operator, sending control instructions to the robot over the network so that each movable part of the robot performs the same action as the operator's corresponding movable part.
2. The robot control method as claimed in claim 1, wherein the relation building step further comprises:
extracting from the three-dimensional image a characteristic image of each calibrated movable part of the operator; and
storing the extracted characteristic images in a memory of the master control device.
3. The robot control method as claimed in claim 2, wherein the image analysis step comprises:
analyzing the captured current-frame and previous-frame three-dimensional images of the operator, and calculating the movement data by comparing the position information of each movable part's characteristic image in the two frames.
4. The robot control method as claimed in claim 1, wherein the image analysis step comprises:
interfacing with middleware analysis software; and
feeding the captured real-time three-dimensional images into the middleware analysis software, which analyzes them to obtain the movement data of each movable part of the operator.
5. The robot control method as claimed in claim 1, wherein the control instruction comprises the movement data, and after the robot receives the instruction, it controls each of its movable parts through its built-in actuation system to perform the same action as the operator's corresponding movable part.
6. A robot control system applied in a master control device, the master control device being connected to a depth camera and communicating with a robot over a network, the system comprising:
an image capturing module, for controlling the depth camera to continuously photograph an operator to obtain real-time three-dimensional images of the operator;
a relation building module, for identifying, according to the N movable parts of the robot, N movable parts of the operator in a three-dimensional image of the operator, and establishing a one-to-one correspondence between the operator's N movable parts and the robot's N movable parts;
an image analysis module, for analyzing the operator's real-time three-dimensional images to obtain movement data for each calibrated movable part of the operator, the movement data comprising each movable part's moving direction and displacement along that direction; and
a control module, for sending, according to the movement data of each movable part of the operator, control instructions to the robot over the network so that each movable part of the robot performs the same action as the operator's corresponding movable part.
7. The robot control system as claimed in claim 6, wherein the relation building module further extracts from the three-dimensional image a characteristic image of each calibrated movable part of the operator, and stores the extracted characteristic images in a memory of the master control device.
8. The robot control system as claimed in claim 7, wherein the image analysis module obtains the movement data by analyzing the captured current-frame and previous-frame three-dimensional images of the operator, and calculating the movement data by comparing the position information of each movable part's characteristic image in the two frames.
9. The robot control system as claimed in claim 6, wherein the image analysis module obtains the movement data by interfacing with middleware analysis software, and feeding the captured real-time three-dimensional images into the middleware analysis software, which analyzes them to obtain the movement data of each movable part of the operator.
10. The robot control system as claimed in claim 6, wherein the control instruction comprises the movement data, and after the robot receives the instruction, it controls each of its movable parts through its built-in actuation system to perform the same action as the operator's corresponding movable part.
CN 201110245701 2011-08-25 2011-08-25 Robot control system and method Pending CN102955563A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110245701 CN102955563A (en) 2011-08-25 2011-08-25 Robot control system and method

Publications (1)

Publication Number Publication Date
CN102955563A true CN102955563A (en) 2013-03-06

Family

ID=47764446

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110245701 Pending CN102955563A (en) 2011-08-25 2011-08-25 Robot control system and method

Country Status (1)

Country Link
CN (1) CN102955563A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103714322A (en) * 2013-12-26 2014-04-09 四川虹欧显示器件有限公司 Real-time gesture recognition method and device
CN104102220A (en) * 2014-07-11 2014-10-15 上海裕山信息科技有限公司 Guide robot for monitoring human body position and control method thereof
CN105334852A (en) * 2014-08-11 2016-02-17 纬创资通股份有限公司 Interference system and computer system of sweeping robot
CN105334852B (en) * 2014-08-11 2018-04-20 纬创资通股份有限公司 Interference system and computer system of sweeping robot
CN105739701A (en) * 2016-01-31 2016-07-06 盛禾东林(厦门)文创科技有限公司 Module of instantaneous generation 3D (Three-Dimensional) model device control matrix structure, and operating system thereof
CN105739701B (en) * 2016-01-31 2018-08-21 盛禾东林(厦门)文创科技有限公司 Moment generates the module and its operating system of 3D model equipments control matrix structure
CN108459707A (en) * 2018-01-26 2018-08-28 上海萌王智能科技有限公司 It is a kind of using intelligent terminal identification maneuver and the system that controls robot


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20130306