CN114952843A - Micro-assembly operating system based on master-slave cooperation of double robots - Google Patents

Micro-assembly operating system based on master-slave cooperation of double robots

Info

Publication number
CN114952843A
Authority
CN
China
Prior art keywords
robot
micro device
slave
controller
master
Prior art date
Legal status
Granted
Application number
CN202210598787.0A
Other languages
Chinese (zh)
Other versions
CN114952843B (en)
Inventor
樊启高
巫亦浩
黄文涛
刘跃跃
毕恺韬
谢林柏
朱一昕
艾建
Current Assignee
Jiangnan University
Original Assignee
Jiangnan University
Priority date
Filing date
Publication date
Application filed by Jiangnan University
Priority to CN202210598787.0A
Publication of CN114952843A
Application granted
Publication of CN114952843B
Legal status: Active
Anticipated expiration

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1664: Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1679: Programme controls characterised by the tasks executed
    • B25J 9/1687: Assembly, peg and hole, palletising, straight line, weaving pattern movement
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697: Vision controlled systems
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The application discloses a micro-assembly operating system based on master-slave cooperation of dual robots, and relates to the technical field of micro-electro-mechanical systems. The system comprises two micro-operation robots whose position feedback information is determined from real-time working images; through closed-loop control, the master robot moves along a set target trajectory while the motion of the slave robot follows the state of the master robot, so that the two robots achieve cooperative control. Because the system uses the two micro-operation robots to cooperatively operate the micro device under a master-slave control strategy, it is greatly improved in information acquisition, processing and control capability compared with a conventional system with a single micro-operation robot, and therefore has higher control capability and operation precision for the micro device.

Description

Micro-assembly operating system based on master-slave cooperation of double robots
Technical Field
The application relates to the technical field of micro-electro-mechanical systems, in particular to a micro-assembly operating system based on master-slave cooperation of double robots.
Background
Micro-electro-mechanical technology is an advanced technology closely related to national economic development and technological progress, and micro-assembly technology is currently a core foundation of research on micro-electro-mechanical technology. A micro-assembly system can realize the combined assembly of micro devices; for example, micro-electro-mechanical devices can be combined into a micro-electro-mechanical system with specific functions, and even cells or tissues of organisms can be stacked and built into a biological organ. Therefore, micro-assembly systems are widely applied in fields such as aerospace, military defense and bioengineering.
Gripping and releasing the micro device is an important part of a micro-assembly system, and the conventional practice is to grip and release the micro device with a micro-operation robot. However, unlike large-sized devices, a micro device is small and requires high precision in gripping and releasing, while the gripper of the micro-operation robot is affected by clearance and eccentricity; moreover, at the micro scale electrostatic attraction and van der Waals forces tend to dominate, so the micro device tends to stick to the gripper when the gripper releases it. For these reasons, the micro-operation robot has great difficulty in gripping the micro device or releasing it at a set position, and the operation precision is difficult to guarantee.
Disclosure of Invention
In view of the above problems and technical needs, the present applicant proposes a micro-assembly operating system based on dual-robot master-slave cooperation, and the technical solution of the present application is as follows:
a micro-assembly operating system based on master-slave cooperation of double robots comprises a controller, a master robot, slave robots, a vision module and an objective platform, wherein the master robot and the slave robots are both three-freedom-degree micro-operation robots, the tail ends of the master robot and the slave robots are provided with a clamp holder, micro devices are placed on the objective platform, the vision module faces the objective platform, and the vision module covers the micro devices and the tail ends of the two robots; the controller is connected with the vision module, the master robot and the slave robot;
the controller collects real-time working images through the vision module at each sampling moment, and performs image recognition on the real-time working images to determine position feedback information p of the main robot under the image coordinates m And position feedback information p from the robot in the image coordinate system s
Under the state that the micro device is clamped by the main robot and the slave robot, the controller is used for controlling the micro device to move according to the target position information p of the micro device in the image coordinate system i And position feedback information p of the main robot m Closed-loop controlling the movement of the main robot to track target position information of the micro device, target position information p i The position information of the central point of the micro device indicated by the target track of the micro device in the image coordinate system at the current sampling moment; the controller feeds back information p according to the position of the main robot m And position feedback information p from the robot s And closed-loop control is carried out on the motion of the slave robot to track the position feedback information of the master robot, and the micro device is cooperatively operated by the master robot and the slave robot to move along the target track.
A further technical scheme is that, when controlling the master robot, the controller takes the tracking error p_i + δ_pa - p_m of the master robot in the image coordinate system as the input of a master PID controller, determines the motion increment u_m of the master robot in the robot coordinate system based on the output of the master PID controller, and controls the master robot according to the motion increment u_m; here δ_pa is the relative position information, in the robot coordinate system, between the center point of the micro device and the first target gripping position of the master robot on the micro device.
A further technical scheme is that, when controlling the slave robot, the controller takes the tracking error p_m + δ_pb - p_s of the slave robot in the image coordinate system as the input of a slave PID controller, determines the motion increment u_s of the slave robot in the robot coordinate system based on the output of the slave PID controller, and controls the slave robot according to the motion increment u_s; here δ_pb is the relative position information, in the robot coordinate system, between the first target gripping position of the master robot on the micro device and the second target gripping position of the slave robot on the micro device.
A further technical scheme is that, in the initial state, neither the master robot nor the slave robot is in contact with the micro device, and the controller performs image recognition on the real-time working image to determine the initial position information p of the center point of the micro device in the image coordinate system; based on the initial position information p of the micro device and the position feedback information p_m of the master robot, the controller closed-loop controls the master robot to move to a first target gripping position to contact and clamp the micro device, and based on the first target gripping position and the position feedback information p_s of the slave robot, the controller closed-loop controls the slave robot to move to a second target gripping position to contact and clamp the micro device; the first target gripping position, the second target gripping position and the initial position information of the micro device have a predetermined positional relationship.
A further technical scheme is that the controller takes the tracking error p + δ_pa - p_m of the master robot in the image coordinate system as the input of the master PID controller, determines the motion increment u_m of the master robot in the robot coordinate system based on the output of the master PID controller, and controls the master robot to move according to the motion increment u_m;
and the controller takes the tracking error h_m + δ_pb - p_s of the slave robot in the image coordinate system as the input of the slave PID controller, determines the motion increment u_s of the slave robot in the robot coordinate system based on the output of the slave PID controller, and controls the slave robot to move according to the motion increment u_s;
where h_m is the position, in the image coordinate system, of the first target gripping position of the master robot on the micro device; δ_pa is the relative position information, in the robot coordinate system, between the center point of the micro device and the first target gripping position of the master robot on the micro device; and δ_pb is the relative position information, in the robot coordinate system, between the first target gripping position of the master robot on the micro device and the second target gripping position of the slave robot on the micro device.
A further technical scheme is that, before the controller controls the master robot and the slave robot to cooperatively operate the micro device, the controller detects whether a first height and a second height are equal, where the first height is the height from the end of the master robot to the plane of the stage when the master robot is at the master robot initial position, and the second height is the height from the end of the slave robot to the plane of the stage when the slave robot is at the slave robot initial position;
when the first height is equal to the second height, the height calibration of the two robots is completed, and the state in which the master robot is at the master robot initial position and the slave robot is at the slave robot initial position is taken as the initial state; when the first height is not equal to the second height, the master robot initial position and/or the slave robot initial position is adjusted until the first height is equal to the second height.
A further technical scheme is that the controller controls the master robot to move at a constant speed along the z-axis direction from the master robot initial position, controls the slave robot to move at a constant speed along the z-axis direction from the slave robot initial position, and detects whether the first height is equal to the second height according to the image coordinate of the gripper at the end of each robot in the image coordinate system, where the z-axis direction is perpendicular to the plane of the stage.
A further technical scheme is that, for either of the master robot and the slave robot, the controller controls the robot to move from its corresponding initial position toward the stage along the z-axis direction until it contacts the stage; during this movement, the image coordinate of the gripper at the end of the robot in the image coordinate system first decreases and then increases, and the position where the image coordinate is minimal is the position at which the robot just contacts the stage (the distance moved from the corresponding initial position to this minimum point is taken as the height from the corresponding initial position to the plane of the stage); this also confirms that the z-axis directions of the two robots are consistent.
A further technical scheme is that the stage is a two-degree-of-freedom moving platform, the controller is connected with and controls the stage, and the controller controls the stage to move so that the initial position of the micro device is within the field-of-view range of the vision module.
A further technical scheme is that, before the controller controls the master robot and the slave robot to cooperatively operate the micro device, the master robot and/or the slave robot adjusts the attitude of the micro device so that the micro device reaches a target attitude.
The beneficial technical effects of this application are:
because the system uses two micro-operation robots to cooperatively operate the micro device under a master-slave control strategy, compared with a conventional system with a single micro-operation robot, the method of cooperative operation of two robots brings a considerable improvement in information acquisition, processing and control capability, so the system has higher control capability and operation precision for the micro device;
in the system, the controller controls the two robots with PID controllers, so that the master robot moves along the set target trajectory while the motion of the slave robot follows the state of the master robot, and the two robots achieve cooperative control; the following performance of the master-slave control is good and the error is small, and the PID controllers are simple in structure and work well.
Drawings
FIG. 1 is a system diagram of the micro-assembly operating system in one embodiment.
FIG. 2 is a schematic flowchart of the dual-robot master-slave cooperative operation of the micro device to move along a target trajectory in one embodiment.
FIG. 3 is a logic block diagram of the closed-loop control of the two robots by the controller in the embodiment shown in FIG. 2.
FIG. 4 is a schematic flowchart of the dual-robot master-slave cooperative operation completing the clamping of the micro device in another embodiment.
FIG. 5 is a logic block diagram of the closed-loop control of the two robots by the controller in the embodiment shown in FIG. 4.
FIG. 6 is a schematic flowchart of the initial calibration and adjustment of the system before the dual robots perform master-slave cooperative operation on the micro device in one embodiment.
Detailed Description
The following description of the embodiments of the present application will be made with reference to the accompanying drawings.
The application discloses a micro-assembly operating system based on master-slave cooperation of dual robots; referring to fig. 1, the micro-assembly operating system comprises a controller 1, a master robot 2, a slave robot 3, a vision module 4 and a stage 5. The master robot 2 and the slave robot 3 are both three-degree-of-freedom micro-operation robots and each have a gripper at the end. In one embodiment, the master robot 2 and the slave robot 3 are both realized with Sensapex micro-operation robots, the grippers are mounted at the ends of the Sensapex micro-operation robots, and each gripper is a tungsten probe with a tip of tens of micrometers.
The micro device 6 is placed on the stage 5, and the entire operation of the micro device 6 takes place within the operating range of both robots. In general, the master robot 2 and the slave robot 3 may be disposed on the two sides of the stage 5 along the x direction of the horizontal plane, as shown in fig. 1.
The vision module 4 faces the stage 5, and its field of view covers the micro device 6 and the ends of the two robots. In one embodiment, the vision module 4 includes a CCD camera 41, and the CCD camera 41 is used to capture images and upload them to the controller for processing for visual servoing. In another embodiment, the vision module 4 further comprises a microscope 42, such as an Olympus microscope; the microscope 42 is used for full-process monitoring of the micro device 6 and the grippers at the ends of the two robots during the micro-assembly operation, and the objects in the field of view are imaged onto the CCD camera 41 through the magnification of the microscope 42, so that the image captured by the vision module 4 is clearer.
The controller 1 is connected with the vision module 4, the master robot 2 and the slave robot 3. In one embodiment, the controller 1 comprises a host computer 11 and a Sensapex micro-manipulator controller 12 connected with the host computer 11; the host computer 11 is connected with the vision module 4, and the Sensapex micro-manipulator controller 12 is connected with and controls the master robot 2 and the slave robot 3.
When the system is applied, the micro device 6 is placed on the stage 5, and the master robot 2 and the slave robot 3 clamp the micro device 6 with their end grippers: the master robot 2 clamps the micro device 6 at a first target gripping position on the micro device 6, and the slave robot 3 clamps it at a second target gripping position. In the image coordinate system, the positions of the two target gripping positions and the position c of the center point of the micro device 6 have the following inherent relationship:

h_m = c + δ_pa
h_s = h_m + δ_pb

where h_m is the position, in the image coordinate system, of the first target gripping position of the master robot 2 on the micro device 6, and h_s is the position, in the image coordinate system, of the second target gripping position of the slave robot 3 on the micro device 6. δ_pa is the relative position information, in the robot coordinate system, between the center point of the micro device 6 and the first target gripping position of the master robot 2 on the micro device. δ_pb is the relative position information, in the robot coordinate system, between the first target gripping position of the master robot 2 on the micro device 6 and the second target gripping position of the slave robot 3 on the micro device 6. The relative position information reflects both distance and direction, so the first target gripping position can be determined from the center point c of the micro device 6, and the second target gripping position can be determined from the first target gripping position. In general, the two target gripping positions are the midpoints of two sides of the micro device 6, so that the first target gripping position, the second target gripping position and the center point of the micro device 6 lie on the same line parallel to the x direction of the robot coordinate system, and the relative position information then reduces to the distance between the two points. Before clamping the micro device 6, the four vertices of the micro device 6 are detected with the Harris corner detection algorithm to determine the midpoints of its two sides, and thus the two target gripping positions are determined.
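As a concrete illustration of this step, the following sketch locates the four vertices of a rectangular micro device with the Harris corner detector and derives the two target gripping positions as the midpoints of the left and right sides. It is only a sketch: the use of OpenCV, the threshold on the corner response and the k-means grouping of responses into four vertices are assumptions of this example, not details given in the application.

import cv2
import numpy as np

def grip_points_from_image(gray):
    # Harris corner response on the grayscale working image (float32 input required).
    response = cv2.cornerHarris(np.float32(gray), 5, 3, 0.04)
    ys, xs = np.nonzero(response > 0.01 * response.max())
    pts = np.float32(np.column_stack([xs, ys]))
    # Group the strong responses into the four vertices of the micro device.
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 50, 0.1)
    _, _, vertices = cv2.kmeans(pts, 4, None, criteria, 10, cv2.KMEANS_PP_CENTERS)
    vertices = vertices[np.argsort(vertices[:, 0])]  # sort vertices from left to right
    c = vertices.mean(axis=0)                        # center point of the device
    h_m = vertices[:2].mean(axis=0)                  # midpoint of the left side (first target gripping position)
    h_s = vertices[2:].mean(axis=0)                  # midpoint of the right side (second target gripping position)
    return c, h_m, h_s

The offsets δ_pa = h_m - c and δ_pb = h_s - h_m then follow from these points, after mapping from the image coordinate system into the robot coordinate system.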
In a state where the master robot 2 and the slave robot 3 both clamp the micro device, the process of the controller 1 performing the micro assembly operation on the micro device 6 by using the master-slave cooperation of the dual robots includes the following steps, please refer to the flowchart shown in fig. 2:
in step 110, the controller 1 applies the improved artificial potential field method to generate a target trajectory of the micro device 6 from the initial position to the end position, where the target trajectory indicates position information of the center point of the micro device 6 in the image coordinate system at different times. The image coordinate system refers to a coordinate system perpendicular to the image plane of the vision module 4 (specifically, the CCD camera 41) mounted on the stage 5, and the xy plane of the image coordinate system is parallel to the stage 5.
In step 120, at each sampling moment, the controller 1 collects a real-time working image through the vision module 4 and performs image recognition on the real-time working image to determine the position feedback information p_m of the master robot 2 in the image coordinate system and the position feedback information p_s of the slave robot 3 in the image coordinate system. It should be noted that in this application the position feedback information of a robot indicates the position of the gripper tip on that robot, and the position feedback information of the micro device 6 indicates the position of the center point of the micro device 6.
In step 130, the controller 1 closed-loop controls the master robot 2 to move according to the target position information p_i of the micro device 6 in the image coordinate system and the position feedback information p_m of the master robot 2, so as to track the target position information of the micro device 6; the target position information p_i is the position information of the center point of the micro device indicated by the target trajectory of the micro device 6 in the image coordinate system at the current sampling moment. In addition, the controller 1 closed-loop controls the slave robot 3 to move according to the position feedback information p_m of the master robot 2 and the position feedback information p_s of the slave robot 3, so as to track the position feedback information of the master robot 2.
If the micro device 6 has not yet reached the end position, steps 120 and 130 are repeated at the next sampling moment until the micro device 6 reaches the end position, so that the micro device 6 is moved along the target trajectory through the cooperation of the master robot 2 and the slave robot 3. When the micro device 6 reaches the end position, the master robot and the slave robot have completed the cooperative operation of the micro device 6 and can release it, so that the micro device 6 is placed at the end position.
The controller 1 implements the closed-loop control of the two robots with PID controllers; refer to the control schematic shown in fig. 3. When controlling the master robot 2, the controller 1 takes the tracking error p_i + δ_pa - p_m of the master robot 2 in the image coordinate system as the input of the master PID controller, determines the motion increment u_m of the master robot 2 in the robot coordinate system based on the output of the master PID controller, and controls the master robot according to the motion increment u_m. In practical application, the master PID controller outputs the motion increment u_c1 of the master robot 2 in the image coordinate system, and the motion increment of the master robot 2 in the robot coordinate system is then obtained through the transformation matrix T1 as u_m = u_c1 × T1. The transformation matrix T1 is obtained by calibrating, in advance, the robot coordinate system of the master robot 2 against the image coordinate system.
Similarly, when controlling the slave robot 3, the controller 1 takes the tracking error p_m + δ_pb - p_s of the slave robot 3 in the image coordinate system as the input of the slave PID controller, determines the motion increment u_s of the slave robot in the robot coordinate system based on the output of the slave PID controller, and controls the slave robot 3 according to the motion increment u_s. The output of the slave PID controller is likewise the motion increment u_c2 of the slave robot 3 in the image coordinate system, and the motion increment of the slave robot 3 in the robot coordinate system is obtained through the transformation matrix T2 as u_s = u_c2 × T2. The transformation matrix T2 is obtained by calibrating, in advance, the robot coordinate system of the slave robot 3 against the image coordinate system.
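The two closed loops of fig. 3 can be sketched as below. The incremental PID form, the gain values and the 2x2 shape of the calibrated transformation matrices T1 and T2 are assumptions of this sketch; the application itself only specifies PID controllers and the mappings u_m = u_c1 × T1 and u_s = u_c2 × T2.

import numpy as np

class IncrementalPID:
    # Incremental PID acting on a 2-D error in the image plane (assumed form).
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.e1 = np.zeros(2)   # e(k-1)
        self.e2 = np.zeros(2)   # e(k-2)

    def step(self, e):
        du = (self.kp * (e - self.e1) + self.ki * e
              + self.kd * (e - 2.0 * self.e1 + self.e2))
        self.e2, self.e1 = self.e1, e.copy()
        return du               # motion increment in image coordinates

def cooperative_step(p_i, p_m, p_s, delta_pa, delta_pb, pid_m, pid_s, T1, T2):
    # One sampling period of the master-slave visual servo loop (inputs as numpy arrays).
    e_m = np.asarray(p_i, float) + np.asarray(delta_pa, float) - np.asarray(p_m, float)   # master tracking error
    e_s = np.asarray(p_m, float) + np.asarray(delta_pb, float) - np.asarray(p_s, float)   # slave tracks the master's feedback
    u_m = pid_m.step(e_m) @ T1      # u_m = u_c1 x T1: increment in the master robot frame
    u_s = pid_s.step(e_s) @ T2      # u_s = u_c2 x T2: increment in the slave robot frame
    return u_m, u_s                 # incremental motion commands sent to the two robots

A possible way to obtain T1 and T2 is to command small known robot displacements, measure the resulting gripper displacements in the image, and solve the resulting linear system in the least-squares sense.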
The embodiment shown in fig. 2 describes the process in which the master robot 2 and the slave robot 3 cooperatively clamp the micro device and move it along the target trajectory. In the initial state of the system, however, the micro device 6 generally rests on the stage 5 at its initial position, the master robot 2 is at the master robot initial position, the slave robot 3 is at the slave robot initial position, and neither robot is in contact with the micro device 6; the controller 1 therefore needs to accurately control the two robots to move from their initial positions to the target gripping positions to clamp the micro device 6. Referring to the flowchart shown in fig. 4, this includes the following steps:
step 410, the controller 1 collects a real-time working image through the vision module 4, and performs image recognition on the real-time working image to determine initial position information p of the central point of the micro device in an image coordinate system.
In step 420, the controller 1 feeds back information p based on the initial position information p of the micro device and the position of the main robot 2 m The main robot 2 is controlled to move in a closed loop. And, the controller 1 clamps the position h based on the first target m And position feedback information p from the robot 3 s Closed loop control moves from the robot 3. First target holding position h m A second target holding position h s And the initial position information of the micro device, the positional relationship is as described above.
Step 430, if based on the position feedback information p of the main robot 2 m Determining that the main robot 2 moves to the first target gripping position h m Contacting and holding the micro device and feeding back information p based on the position of the micro device from the robot s Determining the movement from the robot 3 to the second target gripping position h s Where the micro device 6 is contacted and clamped, the process is completed. Otherwise steps 410 and 420 are re-executed until both robots grip the micro device 6.
In step 420 of the embodiment shown in fig. 4, the controller 1 again uses PID controllers to control the master and slave robots; refer to the control block diagram shown in fig. 5. The controller 1 takes the tracking error p + δ_pa - p_m of the master robot 2 in the image coordinate system as the input of the master PID controller, determines the motion increment u_m of the master robot in the robot coordinate system based on the output of the master PID controller, and controls the master robot to move according to the motion increment u_m, again using the transformation matrix T1; this is not repeated here. The controller 1 takes the tracking error h_m + δ_pb - p_s of the slave robot 3 in the image coordinate system as the input of the slave PID controller, determines the motion increment u_s of the slave robot in the robot coordinate system based on the output of the slave PID controller, and controls the slave robot to move according to the motion increment u_s, likewise using the transformation matrix T2; this is not repeated here either.
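Compared with the trajectory-tracking phase, only the reference inputs change in this clamping phase. The following sketch makes that explicit; pid_m and pid_s stand for PID controller objects with a step() method like the hypothetical IncrementalPID in the previous sketch, T1 and T2 are the calibrated transformation matrices, and all inputs are numpy arrays.

def clamping_step(p0, p_m, p_s, delta_pa, delta_pb, pid_m, pid_s, T1, T2):
    # One sampling period of the clamping phase: both robots are driven toward
    # fixed targets derived from the recognised center point p0 of the device.
    h_m = p0 + delta_pa                 # first target gripping position
    h_s = h_m + delta_pb                # second target gripping position
    e_m = p0 + delta_pa - p_m           # master tracking error (= h_m - p_m)
    e_s = h_m + delta_pb - p_s          # slave tracking error  (= h_s - p_s)
    u_m = pid_m.step(e_m) @ T1          # increment in the master robot frame
    u_s = pid_s.step(e_s) @ T2          # increment in the slave robot frame
    return u_m, u_s, h_m, h_s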
In practical application, before the method of fig. 2 or fig. 4 is executed, the micro-assembly operating system is first initialized, adjusted and calibrated to improve accuracy. The initialization, adjustment and calibration mainly include the following three aspects; refer to the flowchart shown in fig. 6:
firstly, adjusting the field range.
The controller 1 needs to rely on the real-time working image collected by the vision module 4 for controlling the two robots, the robots can ensure to enter the view field range of the vision module 4 through self movement, but if the initial position of the micro device 6 is not in the view field range, the automatic adjustment cannot be carried out, and the micro device 6 is very small in size and difficult to realize through manual adjustment. Therefore, in this embodiment, the object stage 5 is a two-degree-of-freedom mobile platform, the controller 1 is connected to and controls the object stage 5, fig. 1 does not show the connection relationship between the two, and the controller 1 controls the object stage 5 to move so that the initial position of the micro device 6 is within the field of view of the vision module 4, and if the initial position of the micro device 6 is within the field of view of the vision module 4, the adjustment of this step can be skipped. In practical application, PriorH117 may be selected as the carrier platform 5.
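A minimal sketch of this field-of-view adjustment is given below. The detection helper, the stage-motion interface, the image-to-stage mapping matrix T_stage and the pixel tolerance are assumptions of the example; the application only states that the controller moves the two-degree-of-freedom stage until the micro device lies in the field of view.

import numpy as np

def center_device_in_view(detect_device_center, move_stage, image_size, T_stage,
                          tol_px=20, max_steps=50):
    # detect_device_center() -> device center in image coordinates, or None if not visible.
    # move_stage(dx, dy)     -> commands a relative stage motion (hypothetical interface).
    image_center = np.array(image_size, dtype=float) / 2.0
    for _ in range(max_steps):
        c = detect_device_center()
        if c is None:
            return False                     # device not in view at all; a coarse search is needed first
        offset = image_center - np.asarray(c, dtype=float)
        if np.linalg.norm(offset) < tol_px:  # device sufficiently centered in the field of view
            return True
        dx, dy = offset @ T_stage            # map the image offset to a stage displacement
        move_stage(dx, dy)
    return False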
Secondly, adjusting the initial positions of the two robots.
Because the two robots are to be used for cooperative control, it is necessary to ensure that the vertical distances from the two robots, at their respective initial positions, to the micro device 6 are equal; since the micro device 6 is placed on the stage 5, this amounts to ensuring that the vertical distances from the two robots, at their respective initial positions, to the plane of the stage 5 are equal.
Therefore, before the controller 1 controls the master robot 2 and the slave robot 3 to cooperatively operate the micro device, the controller 1 first detects whether a first height and a second height are equal, where the first height is the height from the end of the master robot 2 to the plane of the stage 5 when the master robot is at the master robot initial position, the second height is the height from the end of the slave robot 3 to the plane of the stage 5 when the slave robot is at the slave robot initial position, and the z-axis direction is perpendicular to the plane of the stage.
When the first height is equal to the second height, the height calibration of the two robots is completed, and the methods shown in fig. 4 and fig. 2 can then be executed to realize cooperative control, taking the master robot at the master robot initial position and the slave robot at the slave robot initial position as the initial state. When the first height is not equal to the second height, the master robot initial position and/or the slave robot initial position is adjusted until the first height is equal to the second height.
In one embodiment, the controller 1 detects whether the first height is equal to the second height as follows: the controller 1 controls the master robot 2 to move at a constant speed along the z-axis direction from the master robot initial position, controls the slave robot 3 to move at a constant speed along the z-axis direction from the slave robot initial position, and detects whether the first height is equal to the second height according to the image coordinate of the gripper at the end of each robot in the image coordinate system, where the z-axis direction is perpendicular to the plane of the stage.
Specifically, for either of the master robot 2 and the slave robot 3, the controller 1 controls the robot to move from its corresponding initial position toward the stage along the z-axis direction until it contacts the stage. During this movement, real-time working images are acquired through the vision module 4. Before the robot reaches and contacts the stage 5, the image coordinate of the gripper at the end of the robot in the image coordinate system gradually decreases; once the robot keeps moving after contacting the stage 5, the gripper at the end of the robot deforms and slides along the horizontal direction of the stage 5, so the image coordinate of the gripper in the image coordinate system gradually increases.
Therefore, during the movement the image coordinate of the gripper at the end of the robot in the image coordinate system first decreases and then increases, and the position where the image coordinate is minimal is taken as the position at which the robot just contacts the stage 5. The distance moved from the robot's corresponding initial position to this minimum point is taken as the height from the corresponding initial position to the plane of the stage; the first height and the second height are thereby determined, and it is then detected whether they are equal.
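The contact-detection rule described above (the gripper tip's image coordinate first decreases, then increases after touching the stage) can be sketched as follows; the argument names, the unit of z_step and the equality tolerance are assumptions of this example.

import numpy as np

def height_above_stage(tip_image_coords, z_step):
    # tip_image_coords: image coordinate of the gripper tip recorded at each sample of
    # a constant-speed descent along z; z_step: distance travelled per sample.
    trace = np.asarray(tip_image_coords, dtype=float)
    contact_index = int(np.argmin(trace))    # the minimum marks the moment of contact with the stage
    return contact_index * z_step            # travelled distance = height of the initial position

def heights_calibrated(master_trace, slave_trace, z_step, tol):
    # First-height vs. second-height check used before cooperative operation.
    h1 = height_above_stage(master_trace, z_step)
    h2 = height_above_stage(slave_trace, z_step)
    return abs(h1 - h2) <= tol, h1, h2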
Thirdly, adjusting the initial attitude of the micro device.
To make it easier for the two robots to clamp and operate the micro device 6, the initial attitude of the micro device 6 can be adjusted: if the initial attitude of the micro device 6 is not the target attitude, the master robot 2 and/or the slave robot 3 is used to adjust the attitude of the micro device 6 until it reaches the target attitude. If the initial attitude of the micro device 6 is already the target attitude, this adjustment step can be skipped.
The target attitude is an attitude set in advance to facilitate the clamping operation of the two robots; for example, in the target attitude the micro device 6 is oriented along the y-axis direction of the robot coordinate system. The specific way of adjusting the attitude of the micro device 6 with the two robots is determined by the attitude of the micro device 6 and the arrangement of the two robots, and can be chosen according to the actual situation.
For example, in one case, if the micro device 6 is tilted to the right, the robot on the left side of the micro device 6 is held fixed and the robot on the right side of the micro device 6 is moved along the x-axis direction until the micro device 6 is oriented along the y-axis direction of the robot coordinate system. Similarly, when the micro device 6 is tilted to the left, the robot on the right side of the micro device 6 is held fixed and the robot on the left side of the micro device 6 is moved along the x-axis direction until the micro device 6 is oriented along the y-axis direction of the robot coordinate system.
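This adjustment rule can be sketched as a simple decision step: the device orientation is estimated (for example from its detected vertices), and the robot on the side toward which the device is tilted is commanded along x while the other robot stays fixed. Representing the pose by a single tilt angle, the sign convention, the angle tolerance and the proportional step are assumptions of the sketch.

def attitude_adjustment_command(device_angle_deg, angle_tol_deg=1.0, gain=0.5):
    # device_angle_deg: estimated orientation of the micro device relative to the y axis
    # of the robot coordinate system; positive = tilted right, negative = tilted left.
    if abs(device_angle_deg) <= angle_tol_deg:
        return None                                       # target attitude already reached
    if device_angle_deg > 0:                              # tilted right: hold the left robot,
        return ("right_robot", -gain * device_angle_deg)  # move the right robot along x
    return ("left_robot", -gain * device_angle_deg)       # tilted left: hold the right robot, move the left one along x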
What has been described above is only a preferred embodiment of the present application, and the present application is not limited to the above examples. It is to be understood that other modifications and variations directly derived or suggested to those skilled in the art without departing from the spirit and concepts of the present application are to be considered as being within the scope of the present application.

Claims (10)

1. A micro-assembly operating system based on master-slave cooperation of dual robots, characterized by comprising a controller, a master robot, a slave robot, a vision module and a stage, wherein the master robot and the slave robot are both three-degree-of-freedom micro-operation robots and each have a gripper at the end, a micro device is placed on the stage, the vision module faces the stage, and the field of view of the vision module covers the micro device and the ends of the two robots; the controller is connected with the vision module, the master robot and the slave robot;
the controller collects a real-time working image through the vision module at each sampling moment, and performs image recognition on the real-time working image to determine position feedback information p_m of the master robot in the image coordinate system and position feedback information p_s of the slave robot in the image coordinate system;
in the state where both the master robot and the slave robot clamp the micro device, the controller closed-loop controls the motion of the master robot according to target position information p_i of the micro device in the image coordinate system and the position feedback information p_m of the master robot, so that the master robot tracks the target position information of the micro device, wherein the target position information p_i is the position information of the center point of the micro device indicated by the target trajectory of the micro device in the image coordinate system at the current sampling moment; and the controller closed-loop controls the motion of the slave robot according to the position feedback information p_m of the master robot and the position feedback information p_s of the slave robot, so that the slave robot tracks the position feedback information of the master robot, and the master robot and the slave robot cooperatively operate the micro device to move along the target trajectory.
2. The micro-assembly operating system according to claim 1, wherein, when controlling the master robot, the controller takes the tracking error p_i + δ_pa - p_m of the master robot in the image coordinate system as the input of a master PID controller, determines the motion increment u_m of the master robot in the robot coordinate system based on the output of the master PID controller, and controls the master robot according to the motion increment u_m; wherein δ_pa is the relative position information, in the robot coordinate system, between the center point of the micro device and a first target gripping position of the master robot on the micro device.
3. The micro-assembly operating system according to claim 1, wherein, when controlling the slave robot, the controller takes the tracking error p_m + δ_pb - p_s of the slave robot in the image coordinate system as the input of a slave PID controller, determines the motion increment u_s of the slave robot in the robot coordinate system based on the output of the slave PID controller, and controls the slave robot according to the motion increment u_s; wherein δ_pb is the relative position information, in the robot coordinate system, between a first target gripping position of the master robot on the micro device and a second target gripping position of the slave robot on the micro device.
4. The micro-assembly operating system according to claim 1, wherein
in the initial state, neither the master robot nor the slave robot is in contact with the micro device, and the controller performs image recognition on the real-time working image to determine initial position information p of the center point of the micro device in the image coordinate system; based on the initial position information p of the micro device and the position feedback information p_m of the master robot, the controller closed-loop controls the master robot to move to a first target gripping position to contact and clamp the micro device, and based on the first target gripping position and the position feedback information p_s of the slave robot, the controller closed-loop controls the slave robot to move to a second target gripping position to contact and clamp the micro device; the first target gripping position, the second target gripping position and the initial position information of the micro device have a predetermined positional relationship.
5. The micro-assembly operating system according to claim 4, wherein
the controller takes the tracking error p + δ_pa - p_m of the master robot in the image coordinate system as the input of a master PID controller, determines the motion increment u_m of the master robot in the robot coordinate system based on the output of the master PID controller, and controls the master robot to move according to the motion increment u_m;
and the controller takes the tracking error h_m + δ_pb - p_s of the slave robot in the image coordinate system as the input of a slave PID controller, determines the motion increment u_s of the slave robot in the robot coordinate system based on the output of the slave PID controller, and controls the slave robot to move according to the motion increment u_s;
wherein h_m is the position, in the image coordinate system, of the first target gripping position of the master robot on the micro device; δ_pa is the relative position information, in the robot coordinate system, between the center point of the micro device and the first target gripping position of the master robot on the micro device; and δ_pb is the relative position information, in the robot coordinate system, between the first target gripping position of the master robot on the micro device and the second target gripping position of the slave robot on the micro device.
6. The micro-assembly operating system according to claim 1, wherein before controlling the master robot and the slave robot to cooperatively operate the micro device, the controller detects whether a first height and a second height are equal, the first height being the height from the end of the master robot to the plane of the stage when the master robot is at a master robot initial position, and the second height being the height from the end of the slave robot to the plane of the stage when the slave robot is at a slave robot initial position;
when the first height is equal to the second height, completing height calibration of the two robots, and setting the master robot at the initial position of the master robot and the slave robot at the initial position of the slave robot as initial states; and when the first height is not equal to the second height, adjusting the initial position of the master robot and/or the initial position of the slave robot until the first height is equal to the second height.
7. The micro-assembly operating system according to claim 6, wherein the controller controls the master robot to move from the master robot initial position at a constant speed along the z-axis direction, controls the slave robot to move from the slave robot initial position at a constant speed along the z-axis direction, and detects whether the first height is equal to the second height according to the image coordinate of the gripper at the end of each robot in the image coordinate system, wherein the z-axis direction is perpendicular to the plane of the stage.
8. The micro-assembly operating system according to claim 7, wherein for either of the master robot and the slave robot, the controller controls the robot to move from its corresponding initial position along the z-axis direction toward and into contact with the stage; during the movement of the robot, the image coordinate of the gripper at the end of the robot in the image coordinate system first decreases and then increases, and the distance between the corresponding initial position of the robot and the minimum point of the image coordinate is taken as the height from the corresponding initial position of the robot to the plane of the stage.
9. The micro-assembly operating system according to claim 1, wherein the stage is a two-degree-of-freedom moving platform, and the controller is connected to and controls the stage, the controller controlling the stage to move so that the initial position of the micro device is within the field of view of the vision module.
10. The micro-assembly operating system according to claim 1, wherein before controlling the master robot and the slave robot to cooperatively operate the micro device, the controller performs attitude adjustment of the micro device with the master robot and/or the slave robot so that the micro device reaches a target attitude.
CN202210598787.0A 2022-05-30 2022-05-30 Micro-assembly operating system based on master-slave cooperation of double robots Active CN114952843B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210598787.0A CN114952843B (en) 2022-05-30 2022-05-30 Micro-assembly operating system based on master-slave cooperation of double robots

Publications (2)

Publication Number Publication Date
CN114952843A 2022-08-30
CN114952843B 2023-02-28

Family

ID=82957734

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210598787.0A Active CN114952843B (en) 2022-05-30 2022-05-30 Micro-assembly operating system based on master-slave cooperation of double robots

Country Status (1)

Country Link
CN (1) CN114952843B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0863214A (en) * 1994-08-25 1996-03-08 Fanuc Ltd Visual tracking method
CN111590594A (en) * 2020-06-22 2020-08-28 南京航空航天大学 Robot trajectory tracking control method based on visual guidance
CN111890348A (en) * 2019-05-06 2020-11-06 广州中国科学院先进技术研究所 Control method and device for double-robot cooperative transportation
CN113305851A (en) * 2021-06-17 2021-08-27 东莞理工学院 Online detection device for robot micro-assembly

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0863214A (en) * 1994-08-25 1996-03-08 Fanuc Ltd Visual tracking method
CN111890348A (en) * 2019-05-06 2020-11-06 广州中国科学院先进技术研究所 Control method and device for double-robot cooperative transportation
CN111590594A (en) * 2020-06-22 2020-08-28 南京航空航天大学 Robot trajectory tracking control method based on visual guidance
CN113305851A (en) * 2021-06-17 2021-08-27 东莞理工学院 Online detection device for robot micro-assembly

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHAO Wei et al.: "Visual servo control of a micro-operation robot", Robot (《机器人》) *

Also Published As

Publication number Publication date
CN114952843B (en) 2023-02-28

Similar Documents

Publication Publication Date Title
Fatikow et al. Microrobot system for automatic nanohandling inside a scanning electron microscope
CN108972557B (en) Micro-part pose automatic alignment device and method
Schmoeckel et al. Remotely controllable mobile microrobots acting as nano positioners and intelligent tweezers in scanning electron microscopes (SEMs)
CN107351084B (en) Space manipulator system error correction method for maintenance task
Wang et al. Automated 3-D micrograsping tasks performed by vision-based control
CN108858202B (en) Control method of part grabbing device based on alignment, approach and grabbing
Wang et al. Automatic microassembly using visual servo control
Xing et al. Precision assembly among multiple thin objects with various fit types
Komati et al. Automated robotic microassembly of flexible optical components
US10207413B2 (en) End effector, robot, and robot control apparatus
Fatikow Automated micromanipulation desktop station based on mobile piezoelectric microrobots
Zhou et al. Automatic dextrous microhandling based on a 6-DOF microgripper
CN114952843B (en) Micro-assembly operating system based on master-slave cooperation of double robots
Chu et al. Dual-arm micromanipulation and handling of objects through visual images
Ren et al. 3-D automatic microassembly by vision-based control
Bolopion et al. Stable haptic feedback based on a dynamic vision sensor for microrobotics
Fatikow et al. Microrobot system for automatic nanohandling inside a scanning electron microscope
Huang et al. Development of a robotic microassembly system with multi-manipulator cooperation
CN113771042B (en) Vision-based method and system for clamping tool by mobile robot
Dafflon et al. A micromanipulation setup for comparative tests of microgrippers
Power et al. Direct laser written passive micromanipulator end-effector for compliant object manipulation
Hulsen et al. Control system for the automatic handling of biological cells with mobile microrobots
Kunt et al. Design and development of workstation for microparts manipulation and assembly
Xing et al. A sequence of micro-assembly for irregular objects based on a multiple manipulator platform
CN112123329A (en) Robot 3D vision hand-eye calibration method

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant