CN113885404A - Multi-robot cooperative control system based on universal interface


Info

Publication number
CN113885404A
CN113885404A (application CN202111269043.6A)
Authority
CN
China
Prior art keywords
module
robot
control server
image
api
Prior art date
Legal status
Pending
Application number
CN202111269043.6A
Other languages
Chinese (zh)
Inventor
齐鹏
杨皓冬
Current Assignee
Tongji University
Original Assignee
Tongji University
Priority date: 2021-10-29
Filing date: 2021-10-29
Publication date: 2022-01-04
Application filed by Tongji University filed Critical Tongji University
Priority to CN202111269043.6A
Publication of CN113885404A

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/04 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0428 - Safety, monitoring
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/20 - Pc systems
    • G05B2219/24 - Pc safety
    • G05B2219/24215 - Scada supervisory control and data acquisition

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a multi-robot cooperative control system based on a universal interface, comprising a control server, an extended communication module, a motion control module, an autonomous intelligent module and a log service module. The extended communication module provides an API interface through which the motion control module, the autonomous intelligent module and the log service module are connected with the control server; the robots to be controlled are likewise connected with the control server through the API interface, and each robot is provided with a mechanical execution mechanism. Compared with the prior art, the system does not depend on specific hardware, and the control server can be deployed on any device according to engineering requirements. Adaptability to peripheral devices is strong: a single set of universal API interfaces designed for a family of similar hardware devices suffices for asynchronous cooperative control by the server, which facilitates plug-and-play flexible management of multiple robots. The service is also highly extensible, portable and interference-resistant.

Description

Multi-robot cooperative control system based on universal interface
Technical Field
The invention relates to the technical field of robot control, in particular to a multi-robot cooperative control system based on a universal interface.
Background
The engineering construction industry is one of the pillars of the national economy and plays a significant role in driving its development. Welding is high-strength, high-risk work. The welding environment is usually very harsh: welding dust can cause respiratory tract and lung infections in workers; the bright arc light can cause visual impairment; strong noise harms hearing; and the effects of radiation, ozone and the like are not negligible. The danger index is even higher in high-altitude construction scenes. Welding is an essential process in the construction industry, yet the industry faces the pain points of low productivity, recruitment difficulty and low profit. It is therefore necessary to introduce welding robots to ease the shortage of welders while freeing workers from high-strength, high-risk work.
Most existing welding robots are programmed online as standalone units, so they have a narrow working range, poor portability and similar drawbacks, which do not fit the current concept of flexible manufacturing. Although the mechanical execution mechanisms of some robots involve several mechanical subunits, their programs are solidified in embedded controllers, so the robots cannot adapt to changing environments and lack flexibility.
Disclosure of Invention
The invention aims to overcome the above defects of the prior art by providing a multi-robot cooperative control system based on a universal interface, which achieves asynchronous control of multi-robot equipment through Flask communication and rapid deployment of the control server under different working conditions through a factory function.
The purpose of the invention can be realized by the following technical scheme:
a multi-robot cooperative control system based on a universal interface comprises a control server, an extended communication module, a motion control module, an autonomous intelligent module and a log service module. The extended communication module is provided with an API interface, and the motion control module, the autonomous intelligent module and the log service module are all connected with the control server through the API interface. All robots to be controlled are also connected with the control server through API interfaces, and each robot is provided with a mechanical execution mechanism;
the control server is used for scheduling each module and performing data management and is also provided with an independent GUI control interface;
the motion control module comprises a motion modeling unit and a trajectory planning unit: the motion modeling unit comprises a library of forward and inverse kinematics routines for the mechanical actuating mechanism together with its D-H parameter model, and is used for defining the kinematic and dynamic model of the mechanical actuating mechanism; the trajectory planning unit comprises a path planning model of the mechanical actuating mechanism and is used for defining the path model of its end effector;
the autonomous intelligent module comprises an image acquisition unit and an image identification unit, wherein the image acquisition unit is used for acquiring images of a working space, and the image identification unit is used for carrying out target detection on an object of interest in the working space;
the log service module is used for recording errors and exceptions of the control system. Under complex working conditions a robot may malfunction during mechanical execution, and such hard-to-observe error behavior is faithfully recorded by the server for the operator to review.
The system further comprises a safety monitoring module, which is connected with the control server through the API interface and stops the robot when it detects that a human body or a foreign object has entered the robot's operating range; this prevents accidents caused by people or other objects improperly entering the workspace while the robot is running, and improves the friendliness of human-robot collaboration.
Further, the extended communication module connects the control server with each module through Flask asynchronous communication.
Further, the construction process of the cooperative control system comprises the following steps:
S1, designing a universal API interface in light of the robots and the working scene, and embedding it into the control server;
S2, powering on the control server hardware, deploying it, and starting the control server;
S3, connecting each robot with the extended communication module through the network and the API interface;
S4, setting the mechanical parameters and paths of the mechanical actuator of each robot.
Further, the construction process of the image acquisition unit comprises the following steps (sketched in code below):
A1, image acquisition and frame rate calculation step: calling the hardware camera resource through the OpenCV library function cv2.VideoCapture(), and grabbing the current image frame through the capture.read() function;
A2, image encoding step: encoding the current image frame as Base64 format data;
A3, image transmission step: encapsulating the Base64 encoding of the image into JSON format for transmission;
A4, image decoding step: first extracting the Base64 code of the image from the JSON packet, decoding it into binary data through the library function base64.b64decode(), and finally writing the decoded data into the specified image cache in 'wb' mode and outputting it.
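As a concrete illustration, steps A1 to A4 can be sketched in Python as follows. This is a minimal sketch, assuming a local camera at index 0, JPEG compression before Base64 encoding, and an illustrative cache file name; none of these details are fixed by the steps above.

```python
import base64
import json

import cv2  # OpenCV

# A1: acquire a frame and query the nominal frame rate.
capture = cv2.VideoCapture(0)          # camera index 0 is an assumption
ok, frame = capture.read()             # grab the current image frame
if not ok:
    raise RuntimeError("no frame could be read from the camera")
fps = capture.get(cv2.CAP_PROP_FPS)    # frame rate reported by the driver

# A2: compress to JPEG, then encode the bytes as Base64 text.
ok, jpeg = cv2.imencode(".jpg", frame)
b64_text = base64.b64encode(jpeg.tobytes()).decode("ascii")

# A3: wrap the Base64 payload in a JSON document for transmission.
packet = json.dumps({"fps": fps, "image": b64_text})

# A4: on the receiving side, decode back to binary and write the bytes
# to an image cache file in 'wb' mode.
received = json.loads(packet)
binary = base64.b64decode(received["image"])
with open("frame_cache.jpg", "wb") as f:
    f.write(binary)
```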
Further, the construction process of the image recognition unit comprises:
B1, establishing a dataset of the region of interest, the dataset comprising images under close-shot, long-shot, shadow, exposure, occlusion and adversarial-environment scenes;
B2, manually labeling the dataset with the open-source LabelImg software;
B3, training a neural network detection model for the region of interest with the YOLOv3 deep learning network, and solidifying its weight file into the control server.
Further, the image acquisition unit comprises an industrial camera and transmits live-action image data of the mechanical execution mechanism in the current workspace back to the control server at a set sampling frequency.
Further, the safety monitoring module comprises an infrared sensor and a buzzer, the infrared sensor is used for judging whether a human body or a foreign body enters the running range of the robot, and the buzzer is used for giving an alarm.
Compared with the prior art, the invention has the following beneficial effects:
1. The invention achieves cooperative control of multiple robots without depending on specific hardware. The basis of the cooperative control is the control server, which can be deployed on any device according to engineering requirements: when control precision demands are modest, a PC can serve as the server; when control performance requirements are strict, a dedicated high-bandwidth server should be used.
2. The invention adapts well to multiple robots by abstracting hardware devices into functional communication interfaces; when a new device is introduced, only a set of universal interfaces conforming to the device-facing standard needs to be designed, enabling plug-and-play flexible management. Meanwhile, unified scheduling is performed by the control server, and the service is highly extensible and interference-resistant.
Drawings
Fig. 1 is a block diagram of a control system according to an embodiment of the present invention.
FIG. 2 is a diagram of a hardware device according to an embodiment of the present invention.
FIG. 3 is an image collected under normal operating conditions in accordance with an embodiment of the present invention.
Fig. 4 is a hardware circuit of a safety monitoring unit according to an embodiment of the present invention.
FIG. 5 is a flowchart of a control method according to an embodiment of the present invention.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments. The present embodiment is implemented on the premise of the technical solution of the present invention, and a detailed implementation manner and a specific operation process are given, but the scope of the present invention is not limited to the following embodiments.
As shown in fig. 1, the embodiment provides a multi-robot cooperative control system based on a general interface, which includes a control server, an extended communication module, a motion control module, an autonomous intelligent module, a log service module, and a security monitoring module. The extended communication module is provided with an API interface, and the extended communication module, the motion control module, the autonomous intelligent module, the log service module and the safety monitoring module are all connected with the control server through the API interface. All the robots to be controlled are also connected with the control server through the API interface, and each robot is provided with a mechanical execution mechanism. In the embodiment, the robot adopts a rectangular coordinate type welding robot and a manipulator based on a ZDevelop controller.
Control server
The control server is used for scheduling each module and managing data, and also provides an independent GUI control interface. In this embodiment, as shown in fig. 2, the control server is deployed on a PC to schedule each functional module. To prevent the system from blocking, stalling or aborting on misoperation or miscalculation, which would crash the server and degrade the user experience, the control server in this embodiment adopts a try-except exception-capture structure.
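A minimal sketch of that structure, in which dispatch is a hypothetical helper that wraps every functional-module call so that a single fault is logged instead of crashing the server:

```python
import logging

logger = logging.getLogger("control_server")

def dispatch(module_call, *args, **kwargs):
    """Invoke one functional-module call, trapping any fault it raises."""
    try:
        return module_call(*args, **kwargs)
    except Exception:
        # Record the fault with its traceback and keep the GUI responsive.
        logger.exception("module call failed: %r", module_call)
        return None
```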
Motion control module
The motion control module is the executor of the system, controlling the specific motions of the robots; it comprises a motion modeling unit and a trajectory planning unit:
(1) the motion modeling unit comprises a library of forward and inverse kinematics routines for the mechanical actuating mechanism and its D-H parameter model, and is used for defining the kinematic and dynamic model of the mechanical actuating mechanism; the development specific to the ZDevelop controller in this embodiment proceeds as follows:
based on the motion (screw) vector method, forward kinematic modeling of the robot is performed; only the line vector of each joint axis needs to be determined, and no link coordinate systems need to be established, namely:
[Equation image omitted in source: the line vectors of the four joint axes.]
and further obtaining an exponential form of each motion vector:
[Equation images omitted in source: the matrix-exponential form of each motion vector.]
where the blank entries are 0; [V1], [V2], [V3], [V4] are the matrix forms of the line vectors v1, v2, v3, ω4 of the corresponding joint motions, and the joint variable vector is θ = [dz dy dx θ4].
When the joint variable is 0, the initial pose is obtained:
[Equation image omitted in source: the initial pose M.]
Thus, by the product-of-exponentials composition
[Equation image omitted in source; from the structure of the text this is T = e^([V1]dz) e^([V2]dy) e^([V3]dx) e^([V4]θ4) M.]
a forward kinematic model of the end effector is obtained:
[Equation image omitted in source: the resulting 4x4 homogeneous transform.]
where h is the height of the revolute pair above the origin in the initial state, h = 100; L is the length of the manipulator serving as the end effector, set to 200; and s is the inherent y-direction offset between the revolute axis and the center line of the translational axis, set to 200;
the four-degree-of-freedom rectangular coordinate robot used in the embodiment is
Figure BDA0003328017430000056
The inverse kinematics can be solved directly from the structure of the matrix elements, giving each joint variable of the inverse solution:
[Equation image omitted in source.]
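Since the matrices above survive only as images, here is a hedged Python sketch of the product-of-exponentials forward kinematics the passage describes. The joint-axis directions, the location of the revolute axis and the home pose M are assumptions consistent with the stated parameters h = 100, L = 200, s = 200; the sketch illustrates the technique rather than reproducing the embodiment's exact model.

```python
import numpy as np
from scipy.linalg import expm  # matrix exponential

h, L, s = 100.0, 200.0, 200.0  # parameters given in the text

def prismatic_twist(axis):
    """4x4 matrix form [V] of a pure-translation twist along a unit axis."""
    V = np.zeros((4, 4))
    V[:3, 3] = axis
    return V

def revolute_twist(omega, point):
    """4x4 matrix form [V] of a rotation about axis `omega` through `point`."""
    wx, wy, wz = omega
    W = np.array([[0.0, -wz, wy], [wz, 0.0, -wx], [-wy, wx, 0.0]])
    V = np.zeros((4, 4))
    V[:3, :3] = W
    V[:3, 3] = -W @ np.asarray(point, dtype=float)  # v = -omega x q
    return V

V1 = prismatic_twist([0.0, 0.0, 1.0])              # translation dz
V2 = prismatic_twist([0.0, 1.0, 0.0])              # translation dy
V3 = prismatic_twist([1.0, 0.0, 0.0])              # translation dx
V4 = revolute_twist([0.0, 0.0, 1.0], [0.0, s, h])  # rotation theta4 (assumed axis)

M = np.eye(4)
M[:3, 3] = [L, s, h]  # assumed home position of the end effector

def forward_kinematics(dz, dy, dx, theta4):
    """T = e^[V1]dz . e^[V2]dy . e^[V3]dx . e^[V4]theta4 . M"""
    return (expm(V1 * dz) @ expm(V2 * dy) @ expm(V3 * dx)
            @ expm(V4 * theta4) @ M)
```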
(2) the trajectory planning unit comprises a path planning model of the mechanical actuator and is used for defining the path model of the mechanical actuator's end effector. In this embodiment the trajectory planning unit adopts a circular interpolation control algorithm, described as follows:
According to the input start and end points, the center of the circular interpolation is determined; the center is generated at random on the perpendicular bisector of the start and end points. The distance to move in the next step is then determined from the positional relation between the current point and the circle (inside, outside, or on the circle). The total travel is the Manhattan distance between the two points, the movement precision (i.e. the unit distance) is configurable, and the total number of cycles is the Manhattan distance between the two points divided by the movement precision. There are four ideal path directions in total:
① the ideal path runs from lower-left to upper-right: when the current point is above the line, move one unit in the positive x direction; otherwise move one unit in the positive y direction;
② the ideal path runs from upper-left to lower-right: when the current point is above the line, move one unit in the negative y direction; otherwise move one unit in the positive x direction;
③ the ideal path runs from upper-right to lower-left: when the current point is above the line, move one unit in the negative y direction; otherwise move one unit in the negative x direction;
④ the ideal path runs from lower-right to upper-left: when the current point is above the line, move one unit in the negative x direction; otherwise move one unit in the positive y direction.
The circular interpolation control algorithm is expressed concretely as follows:
[Algorithm expression omitted in source (rendered as an image).]
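The expression itself is not recoverable, but the behavior the passage describes can be sketched in Python as follows. The per-step axis choice here is a greedy stand-in for the four direction cases above (an assumption), while the step count follows the stated rule: the Manhattan distance between the two points divided by the unit distance.

```python
import math

def circular_interpolation(start, end, center, unit=1.0):
    """Step from `start` to `end` along an arc about `center`, one unit
    move in x or y per cycle, keeping the point as close as possible to
    the circle fixed by the start point."""
    x, y = start
    cx, cy = center
    r2 = (x - cx) ** 2 + (y - cy) ** 2    # squared radius
    sx = math.copysign(unit, end[0] - x)  # unit step toward the end point
    sy = math.copysign(unit, end[1] - y)
    steps = round((abs(end[0] - x) + abs(end[1] - y)) / unit)
    path = [(x, y)]
    for _ in range(steps):
        # Deviation from the circle after a candidate x-step or y-step;
        # take whichever move hugs the ideal arc more closely.
        fx = (x + sx - cx) ** 2 + (y - cy) ** 2 - r2
        fy = (x - cx) ** 2 + (y + sy - cy) ** 2 - r2
        if abs(fx) <= abs(fy):
            x += sx
        else:
            y += sy
        path.append((x, y))
    return path

# Example: a quarter arc from (0, 0) to (100, 100) about center (100, 0).
waypoints = circular_interpolation((0.0, 0.0), (100.0, 100.0), (100.0, 0.0))
```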
three, autonomous intelligent module
The autonomous intelligent module comprises an image acquisition unit and an image recognition unit based on a Raspberry Pi controller; the image acquisition unit acquires images of the workspace, and the image recognition unit performs target detection on objects of interest in the workspace.
(1) The image acquisition unit comprises an industrial camera and transmits live-action image data of the mechanical actuator in the current workspace back to the control server at a set sampling frequency. The steps are as follows:
Image acquisition and frame rate calculation: call the hardware camera resource through the OpenCV library function cv2.VideoCapture(), and grab the current image frame through the capture.read() function;
Image encoding step: since an image is a long piece of identification information in the HTTP environment, this embodiment encodes the image data with Base64, one of the most common encodings for transmitting 8-bit byte streams over a network;
Image transmission step: in keeping with the standard data transmission format of Flask communication, this embodiment encapsulates the Base64 encoding of the image into JSON format for transmission;
Image decoding step: first extract the Base64 code of the image from the JSON packet, decode it into binary data through the library function base64.b64decode(), and finally write the decoded data into the specified image cache in 'wb' mode and output it.
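A sketch of how such a frame could be served through Flask, matching the Base64-in-JSON packaging described above. The route path, port and camera index are illustrative assumptions, not the interface actually defined in this embodiment.

```python
import base64

import cv2
from flask import Flask, jsonify

app = Flask(__name__)
capture = cv2.VideoCapture(0)  # camera index 0 is an assumption

@app.route("/camera/frame", methods=["GET"])  # illustrative route path
def camera_frame():
    """Return the current frame as a Base64 string inside a JSON document."""
    ok, frame = capture.read()
    if not ok:
        return jsonify({"error": "no frame available"}), 503
    ok, jpeg = cv2.imencode(".jpg", frame)
    return jsonify({"image": base64.b64encode(jpeg.tobytes()).decode("ascii")})

if __name__ == "__main__":
    # threaded=True lets the server answer several requests concurrently.
    app.run(host="0.0.0.0", port=5000, threaded=True)
```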
(2) The construction process of the image recognition unit comprises:
① establishing the region-of-interest (ROI) dataset: in this embodiment the ROI is defined as the end effector of the mechanical actuator, i.e. the manipulator; the dataset specifically comprises images of the manipulator in each scene, such as close shot, long shot, shadow, exposure, occlusion and adversarial environments;
② data labeling: manually label the dataset with the open-source LabelImg software, generating an xml-format annotation set corresponding to the dataset;
③ training: train the detection model of the mechanical actuator with the YOLOv3 deep learning network, and solidify the weight file into the control server. Fig. 3 shows an image collected and recognized under the normal working conditions of this embodiment.
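As an illustration of how the solidified YOLOv3 weights might then be used on the control server, here is a hedged sketch built on OpenCV's DNN module; the file names and the 416x416 input size are assumptions (416 is a common YOLOv3 default), not values fixed by the embodiment.

```python
import cv2

# Illustrative file names for the trained detector solidified into the server.
net = cv2.dnn.readNetFromDarknet("yolov3_roi.cfg", "yolov3_roi.weights")

def detect_roi(image, conf_threshold=0.5):
    """Run YOLOv3 over one frame and return (x, y, w, h, confidence) boxes."""
    blob = cv2.dnn.blobFromImage(image, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(net.getUnconnectedOutLayersNames())
    h, w = image.shape[:2]
    boxes = []
    for out in outputs:
        for det in out:  # det = [cx, cy, bw, bh, objectness, class scores...]
            conf = float(det[5:].max())
            if conf >= conf_threshold:
                cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                boxes.append((int(cx - bw / 2), int(cy - bh / 2),
                              int(bw), int(bh), conf))
    return boxes
```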
Four, extension communication module
The extended communication module connects the control server with each module through Flask asynchronous communication. The API interface in this embodiment defines a number of RESTful-style HTTP communication interfaces, through which the server can conveniently call the functional modules asynchronously and on multiple threads; some of the interfaces defined in this embodiment are shown in the following table.
[Table of the RESTful API interfaces omitted in source (rendered as images).]
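The table survives only as images, but the asynchronous, multi-threaded calling pattern it supports can be sketched as follows; the endpoint URLs are hypothetical placeholders standing in for the interfaces listed in the table.

```python
from concurrent.futures import ThreadPoolExecutor

import requests  # third-party HTTP client

# Hypothetical robot endpoints; real paths come from the embedded API table.
ENDPOINTS = [
    "http://192.168.1.11:5000/robot/1/status",
    "http://192.168.1.12:5000/robot/2/status",
]

def call(url):
    """One RESTful call to a functional module or robot."""
    return url, requests.get(url, timeout=2.0).json()

# Fan the calls out on a thread pool so that one slow robot does not
# block the others (asynchronous, multi-threaded scheduling).
with ThreadPoolExecutor(max_workers=8) as pool:
    for url, payload in pool.map(call, ENDPOINTS):
        print(url, payload)
```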
Fifth, log service module
To let developers observe system errors and exceptions, and thereby ease system repair and updating, this embodiment adds an exception-log output function: an exception log handler faithfully records the time at which each exception occurs and the file in which it occurs.
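A minimal sketch of such a handler using Python's standard logging module; the file and logger names are illustrative.

```python
import logging

# Each record carries the timestamp plus the file and line of the fault.
handler = logging.FileHandler("control_system_errors.log")
handler.setFormatter(logging.Formatter(
    "%(asctime)s %(levelname)s %(pathname)s:%(lineno)d %(message)s"))

log = logging.getLogger("control_system")
log.addHandler(handler)
log.setLevel(logging.ERROR)

try:
    raise ValueError("demonstration fault")
except ValueError:
    # exc_info=True appends the full traceback to the record.
    log.error("unhandled module exception", exc_info=True)
```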
Sixth, safety monitoring module
The safety monitoring module is connected with the control server through the API interface and stops the robot when it detects that a human body or foreign object has entered the robot's operating range; this prevents accidents caused by people or other objects improperly entering the workspace while the robot is running, and improves the friendliness of human-robot collaboration. As shown in fig. 4, the hardware of the safety monitoring unit mainly comprises a Raspberry Pi controller, an infrared sensor module and an active buzzer module. The infrared sensor module detects whether people or other obstacles are present around the manipulator equipment; the active buzzer module raises the alarm; and the Raspberry Pi controller feeds the sensor data and image data collected by the hardware back to the control server.
Furthermore, the infrared sensor module carries a matched pair of infrared emitting and receiving tubes. The emitting tube emits infrared light at a certain frequency; when the detection direction meets an obstacle (a reflecting surface), the infrared light is reflected back and picked up by the receiving tube. Exploiting the photoresistive effect, the received infrared light changes the resistance on the receiver side, producing a current and other electrical changes in the receiving circuit. Processed by a comparator circuit, this change drives the green indicator lamp to light. At the same time the electrical change, after pre-amplification, filtering, peak detection, shaping, output amplification and analog-to-digital conversion, is output at the output interface as a digital signal, namely a low level. When the I/O port of the active buzzer module is driven high, its transistor conducts and the buzzer, supplied with 5 V, starts to sound. The module uses an active buzzer, so no external oscillation signal is needed; only the supply voltage must be connected.
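A hedged sketch of the monitoring loop on the Raspberry Pi controller, assuming the RPi.GPIO library, BCM pins 17 and 18, and an infrared module whose output goes low on detection (as described above); the actual wiring follows fig. 4.

```python
import time

import RPi.GPIO as GPIO  # available on the Raspberry Pi controller

IR_PIN = 17      # BCM pin numbers are assumptions; match the wiring
BUZZER_PIN = 18

GPIO.setmode(GPIO.BCM)
GPIO.setup(IR_PIN, GPIO.IN)
GPIO.setup(BUZZER_PIN, GPIO.OUT, initial=GPIO.LOW)

def intrusion_detected():
    """The IR module pulls its output low when a reflection is received."""
    return GPIO.input(IR_PIN) == GPIO.LOW

try:
    while True:
        if intrusion_detected():
            GPIO.output(BUZZER_PIN, GPIO.HIGH)  # drive the active buzzer
            # here the unit would also report the event to the control server
        else:
            GPIO.output(BUZZER_PIN, GPIO.LOW)
        time.sleep(0.05)
finally:
    GPIO.cleanup()
```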
As described above, the configuration of the control system in the present embodiment is completed.
FIG. 5 shows the specific flow of multi-robot cooperative control based on the universal interface according to the invention:
step S101, designing a general RESTful API interface as shown in the table above by combining the specific working scene and hardware equipment of the embodiment, and embedding the universal RESTful API interface into a control server;
in other embodiments, the hardware carrier of the server may also be adjusted;
step S102, opening hardware, deploying and starting a control server;
in other embodiments, a factory function mode can be selected for deployment of the server, so that the deployment rapidity is improved;
step S103, connecting each mechanical execution body to a control server through a network;
in other embodiments, the connection mode can also be field bus or serial interface;
step S104, selecting respective mechanical parameters and paths for the plurality of mechanical execution bodies on the control panel, wherein the mechanical paths are all selected to be circular arcs in the embodiment;
in other embodiments, other mechanical parameters and paths may be adjusted, such as linear interpolation;
step S105, each mechanical executive body is driven by the control server to realize asynchronous cooperative control;
and step S106, regularly monitoring the working condition and regularly screening the system service log.
The foregoing detailed description of the preferred embodiments of the invention has been presented. It should be understood that numerous modifications and variations could be devised by those skilled in the art in light of the present teachings without departing from the inventive concepts. Therefore, the technical solutions available to those skilled in the art through logic analysis, reasoning and limited experiments based on the prior art according to the concept of the present invention should be within the scope of protection defined by the claims.

Claims (8)

1. A multi-robot cooperative control system based on a universal interface is characterized by comprising a control server, an extended communication module, a motion control module, an autonomous intelligent module and a log service module, wherein the extended communication module is provided with an API (application programming interface), the motion control module, the autonomous intelligent module and the log service module are all connected with the control server through the API, a plurality of robots to be controlled are also connected with the control server through the API, and each robot is provided with a mechanical execution mechanism;
the control server is used for scheduling each module and performing data management;
the motion control module comprises a motion modeling unit and a track planning unit, wherein the motion modeling unit is used for defining a kinematics and dynamics model of the mechanical actuating mechanism, and the track planning unit is used for defining a path model of an end effector of the mechanical actuating mechanism;
the autonomous intelligent module comprises an image acquisition unit and an image identification unit, wherein the image acquisition unit is used for acquiring images of a working space, and the image identification unit is used for carrying out target detection on an object of interest in the working space;
the log service module is used for recording errors and exceptions of the control system.
2. The multi-robot cooperative control system based on the universal interface as claimed in claim 1, further comprising a safety monitoring module, wherein the safety monitoring module is connected to the control server through an API interface, and stops the robot when it is detected that a human body or a foreign object enters the operation range of the robot.
3. The multi-robot cooperative control system based on the universal interface as claimed in claim 1, wherein the extended communication module connects the control server and each module through Flask asynchronous communication technology.
4. The multi-robot cooperative control system based on the universal interface as claimed in claim 1, wherein the construction process of the cooperative control system comprises:
S1, designing a universal API interface in light of the robots and the working scene, and embedding it into the control server;
S2, powering on the control server hardware, deploying it, and starting the control server;
S3, connecting each robot with the extended communication module through the network and the API interface;
S4, setting the mechanical parameters and paths of the mechanical actuator of each robot.
5. The system of claim 1, wherein the image acquisition unit is constructed by:
A1, an image acquisition and frame rate calculation step: calling the hardware camera resource through the OpenCV library function cv2.VideoCapture(), and grabbing the current image frame through the capture.read() function;
A2, an image encoding step: encoding the current image frame as Base64 format data;
A3, an image transmission step: encapsulating the Base64 encoding of the image into JSON format for transmission;
A4, an image decoding step: first extracting the Base64 code of the image from the JSON packet, decoding it into binary data through the library function base64.b64decode(), and finally writing the decoded data into the specified image cache in 'wb' mode and outputting it.
6. The system of claim 1, wherein the image recognition unit is constructed by:
B1, establishing a dataset of the region of interest, the dataset comprising images under close-shot, long-shot, shadow, exposure, occlusion and adversarial-environment scenes;
B2, manually labeling the dataset with the open-source LabelImg software;
B3, training a neural network detection model for the region of interest with the YOLOv3 deep learning network, and solidifying its weight file into the control server.
7. The system according to claim 1, wherein the image acquisition unit comprises an industrial camera and transmits live-action image data of the mechanical actuator in the workspace back to the control server at a set sampling frequency.
8. The multi-robot cooperative control system based on the universal interface as claimed in claim 2, wherein the safety monitoring module comprises an infrared sensor and a buzzer, the infrared sensor is used for detecting whether a human body or a foreign object enters the operation range of the robot, and the buzzer is used for giving an alarm.
CN202111269043.6A (filed 2021-10-29): Multi-robot cooperative control system based on universal interface. Status: Pending. Publication: CN113885404A (en)

Priority Applications (1)

Application CN202111269043.6A (priority and filing date 2021-10-29): Multi-robot cooperative control system based on universal interface (CN113885404A, en)

Publications (1)

Publication Number: CN113885404A; Publication Date: 2022-01-04

Family

ID=79014387

Family Applications (1)

Application Number: CN202111269043.6A; Title: Multi-robot cooperative control system based on universal interface; Status: Pending

Country Status (1)

CN: CN113885404A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008005662A2 (en) * 2006-07-05 2008-01-10 Battelle Energy Alliance, Llc Generic robot architecture
JP2017047519A (en) * 2015-09-04 2017-03-09 Rapyuta Robotics株式会社 Cloud robotics system, information processor, program, and method for controlling or supporting robot in cloud robotics system
WO2017072771A1 (en) * 2015-10-28 2017-05-04 Bar-Ilan University Robotic cooperative system
CN106502095A (en) * 2016-10-27 2017-03-15 福州大学 A kind of cooperative control method of many industrial robots
CN111761347A (en) * 2020-06-18 2020-10-13 龙铁纵横(北京)轨道交通科技股份有限公司 Intelligent assembly system and method using repair workshop robot
CN112732450A (en) * 2021-01-22 2021-04-30 清华大学 Robot knowledge graph generation system and method under terminal-edge-cloud cooperative framework

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115070789A (en) * 2022-06-09 2022-09-20 博歌科技有限公司 Multi-robot intelligent control interaction platform


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination