CN114750160A - Robot control method, robot control device, computer equipment and storage medium - Google Patents


Publication number: CN114750160A (application CN202210527441.1A)
Authority: CN (China)
Prior art keywords: robot, positioning robot, horizontal, vertical, image acquisition
Legal status: Granted
Application number: CN202210527441.1A
Other languages: Chinese (zh)
Other versions: CN114750160B
Inventors: 刘传真, 郝瑜, 洪俊填, 叶国豪, 王光能, 张国平
Current Assignee: Shenzhen Dazu Robot Co., Ltd.
Original Assignee: Shenzhen Dazu Robot Co., Ltd.
Application filed by Shenzhen Dazu Robot Co., Ltd.
Priority: CN202210527441.1A
Published as CN114750160A; application granted and published as CN114750160B
Legal status: Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The application relates to a robot control method, apparatus, computer device, storage medium, and computer program product. A vertical positioning robot, a horizontal positioning robot, and their image acquisition devices are calibrated into the same world coordinate system. After the image acquisition device on the vertical positioning robot locates the product to be calibrated, the vertical position of that device is obtained; likewise, after the image acquisition device on the horizontal positioning robot locates the product, the horizontal position of that device is obtained. A first error between the vertical position and a basic model and a second error between the horizontal position and the basic model are calculated. When both errors are within the allowable error range, space coordinates are computed from the vertical position, the horizontal position, and the basic model and sent to the vertical positioning robot and the horizontal positioning robot, which improves the working reliability of the robots.

Description

Robot control method, robot control device, computer equipment and storage medium
Technical Field
The present application relates to the field of robotics, and in particular, to a robot control method, apparatus, computer device, storage medium, and computer program product.
Background
With the development of robotics, collaborative robots play an irreplaceable role in fields such as industrial 3C electronics, automotive electronics, daily necessities, and biosafety: they improve product quality and production efficiency and can help protect human safety. Combining robots with machine vision lets them assist people with mechanical work, fully exploiting both the efficiency of the robot and the accuracy of vision. The resulting high efficiency and high precision can greatly accelerate the development of the manufacturing and service industries.
A traditional robot vision positioning method uses a camera mounted on the robot arm: the camera captures images of the product under test to help the robot locate it, after which the robot can perform tasks on the product. However, because of the limited field of view and other factors, this kind of visual positioning cannot determine errors caused by product tilt and similar effects, so the positioning is not accurate enough and the working reliability of the robot is reduced.
Disclosure of Invention
In view of the above technical problems, it is necessary to provide a robot control method, apparatus, computer device, storage medium, and computer program product capable of improving the working reliability of a robot.
In a first aspect, the present application provides a robot control method for controlling robots that include a vertical positioning robot and a horizontal positioning robot, each of which is provided with an image acquisition device for visually positioning a product to be calibrated. The control method includes:
calibrating the vertical positioning robot, the horizontal positioning robot and the image acquisition device to the same world coordinate system;
after the product to be calibrated is positioned by the image acquisition device on the vertical positioning robot, acquiring the vertical position of that image acquisition device, and after the product to be calibrated is positioned by the image acquisition device on the horizontal positioning robot, acquiring the horizontal position of that image acquisition device;
calculating a first error between the vertical position and a basic model and a second error between the horizontal position and the basic model;
when the first error and the second error are both within an allowable error range, calculating space coordinates according to the vertical position, the horizontal position and the basic model;
and sending the space coordinate to the vertical positioning robot and the horizontal positioning robot, so that the vertical positioning robot and the horizontal positioning robot work based on the space coordinate.
In one embodiment, after calculating a first error between the vertical position and a base model and a second error between the horizontal position and the base model, the method further comprises:
when the first error or the second error is not within the allowable error range, sending an adjusting instruction to each robot and returning to the step of acquiring the vertical position of the image acquisition device on the vertical positioning robot after positioning the product to be calibrated, and acquiring the horizontal position of the image acquisition device on the horizontal positioning robot after positioning the product to be calibrated; the adjusting instruction is used to control each image acquisition device to adjust its posture.
In one embodiment, the adjusting instruction is further used to keep the image acquisition device on the vertical positioning robot and the image acquisition device on the horizontal positioning robot spatially perpendicular to each other at 90 degrees.
In one embodiment, before calculating the first error between the vertical position and the base model and the second error between the horizontal position and the base model, the method further comprises:
and acquiring a basic model through the image acquisition device, the vertical positioning robot and the horizontal positioning robot.
In one embodiment, the basic model includes a vertical basic model and a horizontal basic model, and the obtaining of the basic model by the image capturing device, the vertical positioning robot and the horizontal positioning robot includes:
sending a first calibration instruction to the vertical positioning robot; the first calibration instruction is used for controlling an image acquisition device on the vertical positioning robot to determine a vertical basic model according to an initial vertical space direction posture;
sending a second calibration instruction to the horizontal positioning robot; and the second calibration instruction is used for controlling an image acquisition device on the horizontal positioning robot to determine a horizontal basic model according to the initial horizontal space direction posture.
In one embodiment, before calibrating the vertical positioning robot, the horizontal positioning robot, and the image capturing device to be under the same world coordinate system, the method further includes:
and distributing the vertical positioning robot, the horizontal positioning robot and the image acquisition device to the same network segment.
In a second aspect, the present application further provides a robot control apparatus for controlling robots that include a vertical positioning robot and a horizontal positioning robot, each of which is provided with an image acquisition device for visually positioning a product to be calibrated. The apparatus includes:
the coordinate unification module is used for calibrating the vertical positioning robot, the horizontal positioning robot and the image acquisition device to the same world coordinate system;
the position acquisition module is used for acquiring the vertical position of the image acquisition device on the vertical positioning robot after positioning a product to be calibrated through the image acquisition device on the vertical positioning robot, and acquiring the horizontal position of the image acquisition device on the horizontal positioning robot after positioning the product to be calibrated through the image acquisition device on the horizontal positioning robot;
the error calculation module is used for calculating a first error between the vertical position and a basic model and a second error between the horizontal position and the basic model;
the control coordinate calculation module is used for calculating space coordinates according to the vertical position, the horizontal position and the basic model when the first error and the second error are both within an allowable error range;
and the execution module is used for sending the space coordinates to the vertical positioning robot and the horizontal positioning robot so as to enable the vertical positioning robot and the horizontal positioning robot to work based on the space coordinates.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory storing a computer program and a processor that implements the following steps when executing the computer program:
calibrating the vertical positioning robot, the horizontal positioning robot and the image acquisition device to the same world coordinate system;
after a product to be calibrated is positioned through the image acquisition device on the vertical positioning robot, the vertical position of the image acquisition device on the vertical positioning robot is acquired, and after the product to be calibrated is positioned through the image acquisition device on the horizontal positioning robot, the horizontal position of the image acquisition device on the horizontal positioning robot is acquired;
calculating a first error between the vertical position and a basic model and a second error between the horizontal position and the basic model;
when the first error and the second error are both within an allowable error range, calculating space coordinates according to the vertical position, the horizontal position and the basic model;
and sending the space coordinate to the vertical positioning robot and the horizontal positioning robot, so that the vertical positioning robot and the horizontal positioning robot work based on the space coordinate.
In a fourth aspect, the present application further provides a computer-readable storage medium. The computer-readable storage medium stores a computer program which, when executed by a processor, performs the following steps:
calibrating the vertical positioning robot, the horizontal positioning robot and the image acquisition device to the same world coordinate system;
after a product to be calibrated is positioned through the image acquisition device on the vertical positioning robot, the vertical position of the image acquisition device on the vertical positioning robot is acquired, and after the product to be calibrated is positioned through the image acquisition device on the horizontal positioning robot, the horizontal position of the image acquisition device on the horizontal positioning robot is acquired;
calculating a first error between the vertical position and a basic model and a second error between the horizontal position and the basic model;
when the first error and the second error are both within an allowable error range, calculating space coordinates according to the vertical position, the horizontal position and the basic model;
and sending the space coordinate to the vertical positioning robot and the horizontal positioning robot, so that the vertical positioning robot and the horizontal positioning robot work based on the space coordinate.
In a fifth aspect, the present application further provides a computer program product. The computer program product comprises a computer program which, when executed by a processor, performs the following steps:
calibrating the vertical positioning robot, the horizontal positioning robot and the image acquisition device to the same world coordinate system;
after a product to be calibrated is positioned through the image acquisition device on the vertical positioning robot, the vertical position of the image acquisition device on the vertical positioning robot is acquired, and after the product to be calibrated is positioned through the image acquisition device on the horizontal positioning robot, the horizontal position of the image acquisition device on the horizontal positioning robot is acquired;
calculating a first error between the vertical position and a basic model and a second error between the horizontal position and the basic model;
when the first error and the second error are both within an allowable error range, calculating space coordinates according to the vertical position, the horizontal position and the basic model;
and sending the space coordinate to the vertical positioning robot and the horizontal positioning robot, so that the vertical positioning robot and the horizontal positioning robot work based on the space coordinate.
The above robot control method, apparatus, computer device, storage medium, and computer program product are used to control robots that include a vertical positioning robot and a horizontal positioning robot, each provided with an image acquisition device for visually positioning a product to be calibrated. The control method calibrates the vertical positioning robot, the horizontal positioning robot, and the image acquisition devices into the same world coordinate system; acquires the vertical position of the image acquisition device on the vertical positioning robot after it positions the product to be calibrated, and the horizontal position of the image acquisition device on the horizontal positioning robot after it positions the product; calculates a first error between the vertical position and a basic model and a second error between the horizontal position and the basic model; when both errors are within the allowable error range, calculates space coordinates from the vertical position, the horizontal position, and the basic model; and sends the space coordinates to the vertical positioning robot and the horizontal positioning robot so that both robots work based on them.
The vertical positioning robot and the horizontal positioning robot perform visual mutual calibration through the image acquisition device, and the high-precision posture of a product to be calibrated in a space range is calculated in the vertical direction and the horizontal direction of the space by adopting a method of error gradient reduction, so that the visual positioning precision is improved, and the working reliability of the robot is improved.
Drawings
FIG. 1 is a flow diagram of a method for controlling a robot in one embodiment;
FIG. 2 is a schematic diagram of the operation of the robot in one embodiment;
FIG. 3 is a flowchart of a robot control method in another embodiment;
FIG. 4 is a detailed flow diagram of a method for controlling a robot in one embodiment;
FIG. 5 is a block diagram showing the construction of a robot control device according to an embodiment;
FIG. 6 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The robot control method provided by the embodiments of the present application is used to control two or more robots, which include a vertical positioning robot and a horizontal positioning robot; there may be two or more of each. Both the vertical positioning robot and the horizontal positioning robot are provided with an image acquisition device for visually positioning a product to be calibrated. Specifically, the image acquisition devices may be mounted on the mechanical arms of the vertical positioning robot and the horizontal positioning robot, with each arm driving its image acquisition device as it moves. The image acquisition device may be a camera, such as a CCD camera, or another type of device, such as an infrared camera.
In one embodiment, as shown in fig. 1, a robot control method is provided for controlling a robot, where the robot includes a vertical positioning robot and a horizontal positioning robot, and both the vertical positioning robot and the horizontal positioning robot are provided with an image acquisition device for performing visual positioning on a product to be calibrated. The robot is generally a cooperative robot, and may be, for example, a transfer robot, a welding robot, a mold processing robot, an industrial screw machine, or the like. The control method can be executed by a controller arranged in one of the robots, and the robots can communicate with each other. Alternatively, the robot control method may be executed by a server, and data transmission may be performed between the server and each robot. The server may be implemented as a stand-alone server or as a server cluster consisting of a plurality of servers. The control method comprises the following steps:
step 202, calibrating the vertical positioning robot, the horizontal positioning robot and the image acquisition device to the same world coordinate system.
Because a camera can be placed at any position in the environment, a reference coordinate system is chosen to describe the position of the camera and of any object in the environment; this coordinate system is called the world coordinate system. Since each robot has its own independent coordinate system and each image acquisition device has its own pixel coordinate system, the vertical positioning robot, the horizontal positioning robot, and the image acquisition devices are calibrated into the same world coordinate system to simplify computation. Calibration unifies the coordinate systems of the vertical positioning robot and the horizontal positioning robot and the pixel coordinate systems of the image acquisition devices into one world coordinate system, which facilitates the subsequent operations.
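As a minimal illustration of why the unified world coordinate system matters, the sketch below maps a point measured in one camera's frame into the shared world frame with a homogeneous transform. The transform values, frame offsets, and function names are illustrative assumptions, not taken from the patent.

```python
def mat_vec(T, p):
    """Apply a 4x4 homogeneous transform T (nested lists) to a 3-D point p."""
    x, y, z = p
    v = (x, y, z, 1.0)
    w = [sum(T[r][c] * v[c] for c in range(4)) for r in range(4)]
    return (w[0], w[1], w[2])

# Assumed calibration result: the vertical camera's frame is offset from the
# world origin by (0.1, 0.2, 0.5) metres, with no rotation, for simplicity.
T_world_from_cam = [
    [1.0, 0.0, 0.0, 0.1],
    [0.0, 1.0, 0.0, 0.2],
    [0.0, 0.0, 1.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
]

p_cam = (0.0, 0.0, 0.3)                 # product position as seen by the camera
p_world = mat_vec(T_world_from_cam, p_cam)
print(p_world)                          # approximately (0.1, 0.2, 0.8)
```

Once every robot and camera has such a transform, positions reported by any of them can be compared directly in the shared world frame.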
And 204, after positioning the product to be calibrated through the image acquisition device on the vertical positioning robot, acquiring the vertical position of the image acquisition device on the vertical positioning robot, and after positioning the product to be calibrated through the image acquisition device on the horizontal positioning robot, acquiring the horizontal position of the image acquisition device on the horizontal positioning robot.
As shown in fig. 2, the vertical positioning robot may position the product to be calibrated in the height direction through the image capturing device disposed thereon. Taking the example that the image acquisition device is arranged on a mechanical arm of the vertical positioning robot, the movement direction of the mechanical arm is the height direction of the product to be calibrated, and the image acquisition device is driven to move in the height direction of the product to be calibrated to position the product to be calibrated. The product to be calibrated is positioned in a proper visual field range of the image acquisition device, so that the image of the product to be calibrated is clear. After a product to be calibrated is positioned, the vertical position of the image acquisition device on the vertical positioning robot is obtained, and the vertical position can comprise a height coordinate, an angle and the like.
Similarly, the horizontal positioning robot can position the product to be calibrated in the horizontal direction through the image acquisition device arranged on the horizontal positioning robot. Taking the example that the image acquisition device is arranged on a mechanical arm of a horizontal positioning robot, the movement direction of the mechanical arm is the horizontal direction of a product to be calibrated, and the image acquisition device is driven to move in the horizontal direction of the product to be calibrated so as to position the product to be calibrated. Further, the horizontal direction may include a first direction and a second direction, i.e., an x direction and a y direction, which are perpendicular. The position in the horizontal direction can be obtained through the first direction and the second direction, and horizontal positioning is achieved. After a product to be calibrated is positioned, the horizontal position of the image acquisition device on the horizontal positioning robot is acquired, and the horizontal position can comprise a horizontal coordinate, an angle and the like.
In step 206, a first error between the vertical position and the base model and a second error between the horizontal position and the base model are calculated.
Here, the vertical position can be understood as the vertical position obtained after the image acquisition device repositions the product to be calibrated once the product has moved, and the horizontal position as the horizontal position obtained in the same way; the movement of the product may be a flip, a displacement, or the like. The basic model indicates the initial positions of the image acquisition devices when calibrating the product to be calibrated, and may include the initial position of the product and the initial positions of the image acquisition devices. The basic model can be determined through manual teaching, for example by manually controlling the robot's position to fix the initial vertical and horizontal spaces and thereby obtain the basic model.
The first error is a deviation value between the vertical position and the vertical position in the base model, and the first error can represent a deviation value between the vertical position after the product to be calibrated is repositioned by the image acquisition device and the initial vertical position after the product to be calibrated moves (such as displacement and overturning) in the vertical direction. Similarly, the second error is a deviation between the horizontal position and the horizontal position in the base model, and the second error may represent a deviation between the horizontal position and the initial horizontal position after the product to be calibrated is repositioned by the image capturing device after the product to be calibrated moves (e.g., shifts and flips) in the horizontal direction. The movement direction and the movement degree of the image acquisition device can be obtained through the first error and the second error, so that the movement direction and the movement degree of a product to be calibrated are represented.
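A minimal sketch of how such a deviation value might be computed, assuming the vertical position is reduced to a height coordinate and an angle; the field names and the Euclidean error metric are illustrative assumptions, and the second error for the horizontal position would be computed analogously from its horizontal coordinates.

```python
import math

def vertical_error(measured, base):
    """First error: deviation of the measured vertical pose from the basic model."""
    dz = measured["z"] - base["z"]          # height deviation
    da = measured["angle"] - base["angle"]  # angular deviation
    return math.hypot(dz, da)               # combined deviation magnitude

base_vertical = {"z": 120.0, "angle": 0.0}      # taught initial vertical pose
measured_vertical = {"z": 121.5, "angle": 2.0}  # pose after repositioning
print(vertical_error(measured_vertical, base_vertical))  # 2.5
```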
And step 208, when the first error and the second error are both within the allowable error range, calculating the space coordinate according to the vertical position, the horizontal position and the basic model.
When the first error and the second error are both within the allowable error range, the motion amplitude of the image acquisition devices, and therefore of the product to be calibrated, is considered small, and the product is taken to be within the target range. The space coordinates are the space coordinates of the product to be calibrated and represent its spatial position. The current positions of the image acquisition devices can be obtained from the vertical position and the horizontal position; combined with the basic model, this yields the positions of the image acquisition devices, the position of the product to be calibrated, and the correspondence between them, from which the space coordinates of the product can be calculated.
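A hedged sketch of this fusion step: assuming the basic model stores fixed camera-to-product offsets, the vertical robot contributes the z coordinate and the horizontal robot the x and y coordinates. All names and offset values are illustrative, not from the patent.

```python
def space_coordinate(vertical_pos, horizontal_pos, base_model):
    """Fuse the two single-view measurements into one spatial coordinate."""
    x = horizontal_pos["x"] - base_model["cam_to_product_x"]
    y = horizontal_pos["y"] - base_model["cam_to_product_y"]
    z = vertical_pos["z"] - base_model["cam_to_product_z"]
    return (x, y, z)

# Assumed offsets between each camera and the product, taken from teaching.
base = {"cam_to_product_x": 10.0, "cam_to_product_y": 5.0, "cam_to_product_z": 50.0}
coord = space_coordinate({"z": 170.0}, {"x": 40.0, "y": 25.0}, base)
print(coord)  # (30.0, 20.0, 120.0)
```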
And step 210, sending the space coordinates to the vertical positioning robot and the horizontal positioning robot, so that the vertical positioning robot and the horizontal positioning robot work based on the space coordinates.
After the space coordinates are obtained, the space coordinates are sent to the vertical positioning robot and the horizontal positioning robot, so that the vertical positioning robot and the horizontal positioning robot can execute actions on products to be calibrated according to the space coordinates, and corresponding work is completed.
The robot control method is used to control robots that include a vertical positioning robot and a horizontal positioning robot, each provided with an image acquisition device for visually positioning a product to be calibrated. The control method calibrates the vertical positioning robot, the horizontal positioning robot, and the image acquisition devices into the same world coordinate system; acquires the vertical position of the image acquisition device on the vertical positioning robot after it positions the product to be calibrated, and the horizontal position of the image acquisition device on the horizontal positioning robot after it positions the product; calculates a first error between the vertical position and the basic model and a second error between the horizontal position and the basic model; when both errors are within the allowable error range, calculates space coordinates from the vertical position, the horizontal position, and the basic model; and sends the space coordinates to the vertical positioning robot and the horizontal positioning robot so that both robots work based on them.
The vertical positioning robot and the horizontal positioning robot perform visual mutual calibration through the image acquisition device, and the high-precision posture of a product to be calibrated in a space range is calculated in the vertical direction and the horizontal direction of the space by adopting a method of error gradient reduction, so that the visual positioning precision is improved, and the working reliability of the robot is improved.
In one embodiment, as shown in fig. 3, after step 206, the robot control method further comprises step 207.
And step 207, when the first error or the second error is not within the allowable error range, sending an adjusting instruction to each robot, and returning to the step 204.
The adjustment instruction is used to control each image acquisition device to adjust its posture. After receiving the adjustment instruction, each robot drives its image acquisition device to move, thereby adjusting the posture of the image acquisition device mounted on it, for example by changing the photographing angle or the distance between the image acquisition device and the product to be calibrated, so that the product can be positioned again. After the posture adjustment, the method returns to step 204: the product to be calibrated is repositioned and compared with the basic model, and the process iterates until the first error and the second error are both within the allowable error range, at which point the space coordinates are calculated and sent to the vertical positioning robot and the horizontal positioning robot so that they can complete the task.
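A minimal sketch of this adjust-and-remeasure loop in Python. The names `position_product`, `adjust_pose`, the camera objects and the tolerance value are illustrative assumptions, not names defined in this application:

```python
# Hypothetical sketch of the iterate-until-within-tolerance loop (steps 204-207).
# position_product() and adjust_pose() stand in for the measurement and
# posture-adjustment operations described in the text.

def within_tolerance(error, tolerance):
    """True when every error component lies inside the allowable error range."""
    return all(abs(e) <= tolerance for e in error)

def locate_with_iteration(vertical_cam, horizontal_cam, base_model,
                          tolerance=0.01, max_iter=50):
    for _ in range(max_iter):
        v_pos = vertical_cam.position_product()    # step 204: vertical position
        h_pos = horizontal_cam.position_product()  # step 204: horizontal position
        first_error = [v - b for v, b in zip(v_pos, base_model["vertical"])]
        second_error = [h - b for h, b in zip(h_pos, base_model["horizontal"])]
        if within_tolerance(first_error, tolerance) and \
           within_tolerance(second_error, tolerance):
            return v_pos, h_pos                    # both errors in range
        # step 207: send an adjustment instruction, then measure again
        vertical_cam.adjust_pose(first_error)
        horizontal_cam.adjust_pose(second_error)
    raise RuntimeError("errors did not fall within tolerance")
```

The loop terminates either when both errors fall within tolerance or after a bounded number of iterations.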
Furthermore, the adjustment instruction is also used to keep the image acquisition device on the vertical positioning robot and the image acquisition device on the horizontal positioning robot perpendicular to each other at 90° in space. Because each image acquisition device is generally a planar camera, it can provide X and Y direction parameters together with three angular direction parameters; when the two image acquisition devices are kept perpendicular at 90° in space, coordinates for all six spatial degrees of freedom of the vertical positioning robot and the horizontal positioning robot can be provided, so that deviation correction is achieved and the visual positioning precision is further improved.
In one embodiment, as shown in fig. 3, prior to step 206, the robot control method further comprises step 205.
Step 205: acquire the basic model through the image acquisition devices, the vertical positioning robot and the horizontal positioning robot.
The basic model indicates the initial positions of the image acquisition devices when calibrating the product, and may include the initial position of the product to be calibrated and the initial position of each image acquisition device. Acquiring the basic model through the image acquisition devices, the vertical positioning robot and the horizontal positioning robot makes each positioning of the product better match actual conditions, which improves the visual positioning precision. The basic model can be determined through an artificial teaching process, for example by manually controlling the movement positions of the vertical positioning robot and the horizontal positioning robot to determine an initial vertical space and an initial horizontal space. It is understood that in other embodiments the basic model may also be an initial value set before the method is performed, so that it need not be recalibrated each time, saving steps in the work flow.
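As a rough illustration, the basic model described here can be thought of as a record of teach positions. The class and field names below are assumptions for illustration only, not taken from this application:

```python
# Illustrative sketch of a base model recorded by artificial teaching.
from dataclasses import dataclass

@dataclass
class BaseModel:
    product_position: tuple         # initial pose of the product to be calibrated
    vertical_cam_position: tuple    # initial pose of the vertical robot's camera
    horizontal_cam_position: tuple  # initial pose of the horizontal robot's camera

# Recorded once while manually jogging the robots to their initial vertical
# and horizontal spaces, then reused as the reference that later
# measurements are compared against.
taught_model = BaseModel((0.0, 0.0, 0.0), (0.0, 0.0, 0.2), (0.3, 0.0, 0.0))
```

The poses here are placeholder tuples; a real system would store full 6-degree-of-freedom poses.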
In one embodiment, as shown in FIG. 3, step 205 includes step 302 and step 304.
Step 302, a first calibration instruction is sent to the vertical positioning robot.
The first calibration instruction is used to control the image acquisition device on the vertical positioning robot to determine the vertical basic model according to the initial vertical spatial orientation. After receiving the first calibration instruction, the image acquisition device on the vertical positioning robot determines the vertical basic model according to the initial vertical spatial orientation (for example, the RZ direction). The vertical basic model may include the vertical position of the product to be calibrated and the initial position of the image acquisition device.
Step 304: send a second calibration instruction to the horizontal positioning robot.
The second calibration instruction is used to control the image acquisition device on the horizontal positioning robot to determine the horizontal basic model according to the initial horizontal spatial orientation. After receiving the second calibration instruction, the image acquisition device on the horizontal positioning robot determines the horizontal basic model according to the initial horizontal spatial orientations (for example, the RX and RY directions). The horizontal basic model may include the horizontal position of the product to be calibrated and the horizontal position of the image acquisition device. Acquiring the basic model through both the vertical positioning robot and the horizontal positioning robot makes the resulting basic model more comprehensive and accurate.
In one embodiment, as shown in fig. 3, the robot control method further comprises step 201 before step 202.
Step 201, distributing the vertical positioning robot, the horizontal positioning robot and the image acquisition device to the same network segment.
Devices are on the same network segment when their IP addresses share the same network address; the subnet mask divides an IP address into a network portion and a host portion, so addresses on the same network segment necessarily share the same subnet mask and network address, and an IP address range is allocated within that segment. Allocating the vertical positioning robot, the horizontal positioning robot and the image acquisition devices to the same network segment allows them to communicate with each other directly, which improves communication efficiency. It is understood that in other embodiments the vertical positioning robot, the horizontal positioning robot and the image acquisition devices may instead communicate indirectly through auxiliary equipment, as long as such communication can be realized by a person skilled in the art.
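The same-segment condition can be checked with Python's standard `ipaddress` module; the addresses and mask below are illustrative:

```python
# Checking whether two devices fall on the same network segment.
import ipaddress

def same_segment(ip_a, ip_b, netmask):
    """True when both addresses share the same network address under the mask."""
    net_a = ipaddress.ip_network(f"{ip_a}/{netmask}", strict=False)
    net_b = ipaddress.ip_network(f"{ip_b}/{netmask}", strict=False)
    return net_a == net_b

# A robot and a camera both on 192.168.1.0/24 can communicate directly.
on_same = same_segment("192.168.1.10", "192.168.1.20", "255.255.255.0")     # True
off_segment = same_segment("192.168.1.10", "192.168.2.20", "255.255.255.0")  # False
```

`strict=False` lets `ip_network` derive the network address from a host address plus mask rather than rejecting it.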
For a better understanding of the above embodiments, a detailed description is given below in conjunction with a specific embodiment. In this embodiment, both robots are cooperative robots, there are two cooperative robots and two image acquisition devices, and each image acquisition device is a camera; please refer to fig. 4. The robot control method includes:
When the cooperative robots and the cameras are started, the IP addresses of the cooperative robots and the cameras are configured in the same network segment, so that they can communicate with each other directly and communication efficiency is improved. Camera 1 is mounted on the end of cooperative robot 1 and camera 2 is mounted on the end of cooperative robot 2.
Next, cooperative robots 1 and 2 and cameras 1 and 2 are calibrated to the same world coordinate system. Because each cooperative robot has its own independent coordinate system and each camera has its own pixel coordinate system, all the cooperative robots and cameras need to be unified into one world coordinate system to make the calculation convenient; the function of the calibration is precisely to unify the coordinate systems of the two cooperative robots and the pixel coordinate systems of the cameras into one world coordinate system.
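One common way to express such a unification is with 4×4 homogeneous transforms. The sketch below, with purely illustrative calibration matrices (identity rotations and simple translations), shows how a point measured in a camera frame maps into the shared world frame:

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Illustrative calibration results: the pose of robot 1's base in the world
# frame, and the pose of camera 1 in the robot's frame (as a hand-eye
# calibration would provide). Real calibrations yield non-trivial rotations.
T_world_robot1 = make_transform(np.eye(3), [0.0, 0.0, 0.0])
T_robot1_cam1 = make_transform(np.eye(3), [0.0, 0.0, 0.1])

# A point measured in camera 1's frame, in homogeneous coordinates,
# mapped into the shared world coordinate system by chaining transforms:
p_cam = np.array([0.02, 0.01, 0.30, 1.0])
p_world = T_world_robot1 @ T_robot1_cam1 @ p_cam
```

Once every robot and camera has such a transform into the world frame, measurements from all of them can be compared and combined directly.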
Meanwhile, camera 1 determines a base model of the calibration product in the initial vertical spatial orientation (the RZ direction), and camera 2 determines base models of the calibration product in the initial horizontal spatial orientations (the RX and RY directions, giving two models). The initial vertical space and the initial horizontal space are determined through an artificial teaching process, that is, by manually controlling the movement positions of the cooperative robots.
When the product deviates by some spatial angle, cooperative robot 1 drives camera 1 to position the product in the RZ direction and compares the result with the base model, while cooperative robot 2 drives camera 2 to position the product in the RX and RY directions and compares the result with the base model. If the RZ, RX and RY direction models are all within the error range, the X, Y, Z, RX, RY and RZ coordinates can be calculated from the data of camera 1 and camera 2 and sent to the executing mechanism to complete the task. A typical cooperative robot has 6 joints, i.e. 6 degrees of freedom, and X, Y, Z, RX, RY and RZ correspond to these 6 degrees of freedom; X, Y and Z differ from RX, RY and RZ in their direction of movement in the world coordinate system.
Conversely, if the RZ, RX or RY direction model is not within the error range, the posture of cooperative robot 1 is adjusted and cooperative robot 2 makes a corresponding posture adjustment, so that the relative position of camera 1 and camera 2 remains perpendicular at 90°. Keeping camera 1 and camera 2 perpendicular at 90° in space serves the following function: each camera is a planar camera that can provide X and Y direction parameters and three angular parameters, and combining the two cameras at 90° in space can provide coordinates for all 6 degrees of freedom of the cooperative robot space, thereby realizing deviation correction.
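One way to picture how the two perpendicular planar cameras combine into a full 6-degree-of-freedom pose is the following sketch. The split of axes between the cameras (vertical camera contributing X, Y, RZ; horizontal camera contributing Z, RX, RY) is an illustrative assumption consistent with the directions named above, not a decomposition the application spells out:

```python
# Assembling a 6-DOF pose from two perpendicular planar-camera readings.
# Which camera contributes which axes is an assumption for illustration.

def assemble_pose(vertical_reading, horizontal_reading):
    """vertical_reading supplies x, y, rz; horizontal_reading supplies z, rx, ry."""
    return {
        "X": vertical_reading["x"],
        "Y": vertical_reading["y"],
        "RZ": vertical_reading["rz"],
        "Z": horizontal_reading["z"],
        "RX": horizontal_reading["rx"],
        "RY": horizontal_reading["ry"],
    }

pose = assemble_pose({"x": 1.0, "y": 2.0, "rz": 0.5},
                     {"z": 3.0, "rx": 0.1, "ry": 0.2})
```

Each camera supplies the in-plane quantities it can observe, and the 90° arrangement makes the two sets of quantities complementary rather than redundant.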
After the positions of camera 1 and camera 2 are adjusted, camera 1 positions the product in the RZ direction again and compares it with the base model, camera 2 does the same in the RX and RY directions, and the process iterates until the positioned product matches the base model within the error range in the RX, RY and RZ directions, after which the X, Y, Z, RX, RY and RZ coordinates can be sent to the executing mechanism to complete the task.
This scheme determines whether cooperative robot 1 (i.e. the executing mechanism) has reached the execution position (i.e. the 6-degree-of-freedom spatial position of the product) by comparing the real-time camera position with the base position. Since current camera sensors have reached the hundred-megapixel level, the pixel equivalent of a camera can be calculated after calibration (one pixel represents an actual physical quantity: for example, if the actual physical distance between two points is 10 mm and the pixel distance is 100 pixels, the pixel equivalent is 10/100 mm/pixel). The smaller the actual physical range and the larger the camera's pixel count, the smaller the pixel equivalent, and micron-level positioning can be achieved. For example, if the camera's actual photographing range is 60 mm × 40 mm and its resolution is 6000 pixel × 4000 pixel, the pixel equivalent is 0.01 mm/pixel, and when the real-time camera position is within one pixel of the base model, micron-level positioning is achieved.
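The pixel-equivalent arithmetic in this paragraph can be written out directly:

```python
# Pixel equivalent: the physical length represented by one pixel, using the
# figures given in the text.

def pixel_equivalent(physical_mm, pixels):
    """Physical size of the field of view divided by its size in pixels."""
    return physical_mm / pixels

pe_example = pixel_equivalent(10.0, 100)   # 10 mm over 100 pixels = 0.1 mm/pixel
pe_camera = pixel_equivalent(60.0, 6000)   # 60 mm field over 6000 pixels = 0.01 mm/pixel
# A one-pixel agreement with the base model at 0.01 mm/pixel corresponds to
# 10 micrometers, i.e. the micron-level positioning described in the text.
```

The same computation applies along the 40 mm / 4000 pixel axis, which yields the identical 0.01 mm/pixel figure.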
In this robot control method, planar vision is mounted at the end of each cooperative robot; through the dual-arm cooperative robots and high-precision planar cameras, the two cameras iterate interactively, and the high-precision posture of the workpiece within the spatial range is calibrated by an error-gradient-descent method, yielding the spatial position (X, Y, Z) and the spatial posture angles (RX, RY, RZ) of the workpiece within the working range of the mechanical arms. The multiple cooperative robots calibrate each other visually and iterate several times in the vertical and horizontal spatial directions, so that the high-precision 6D posture of the product is calculated; this preserves the spatial freedom of the cooperative robots while retaining the precision of planar vision. Together, the multi-machine cooperative robots and vision can accomplish high-precision grasping of three-dimensional products in space, improving the millimeter-level positioning precision of a conventional cooperative robot paired with a 3D camera to 10 micrometers, for use in fields such as semiconductor packaging and high-precision chip assembly.
It should be understood that, although the steps in the flowcharts of the above embodiments are shown in the sequence indicated by the arrows, they are not necessarily executed in that sequence. Unless explicitly stated herein, the steps are not strictly limited to the order illustrated and may be performed in other orders. Moreover, at least some of the steps in these flowcharts may comprise multiple sub-steps or stages, which need not be completed at the same time but may be performed at different times, and whose order of execution is not necessarily sequential: they may be performed in turn or alternately with other steps, or with at least a part of the sub-steps or stages of other steps.
Based on the same inventive concept, an embodiment of the present application also provides a robot control apparatus for implementing the robot control method described above. The solution to the problem provided by the apparatus is similar to that described for the method above, so for the specific limitations in the one or more embodiments of the robot control apparatus provided below, reference may be made to the limitations of the robot control method above; they are not repeated here.
In one embodiment, as shown in fig. 5, there is provided a robot control apparatus including a coordinate unification module 102, a position collection module 104, an error calculation module 106, a control coordinate calculation module 108, and an execution module 110, wherein:
the coordinate unification module 102 is used for calibrating the vertical positioning robot, the horizontal positioning robot and the image acquisition device to the same world coordinate system;
the position acquisition module 104 is used for acquiring the vertical position of the image acquisition device on the vertical positioning robot after positioning the product to be calibrated through the image acquisition device on the vertical positioning robot, and acquiring the horizontal position of the image acquisition device on the horizontal positioning robot after positioning the product to be calibrated through the image acquisition device on the horizontal positioning robot;
an error calculation module 106, configured to calculate a first error between the vertical position and the base model and a second error between the horizontal position and the base model;
a control coordinate calculation module 108, configured to calculate spatial coordinates according to the vertical position, the horizontal position, and the base model when the first error and the second error are both within an allowable error range;
and the execution module 110 is configured to send the spatial coordinates to the vertical positioning robot and the horizontal positioning robot, so that the vertical positioning robot and the horizontal positioning robot work based on the spatial coordinates.
In one embodiment, the robot control apparatus further includes an adjustment module. When the first error or the second error is not within the allowable error range, after the error calculation module 106 calculates the first error between the vertical position and the basic model and the second error between the horizontal position and the basic model, the adjustment module sends an adjustment instruction to each robot and returns control to the position acquisition module 104, which again acquires the vertical position of the image acquisition device on the vertical positioning robot after that device positions the product to be calibrated, and the horizontal position of the image acquisition device on the horizontal positioning robot after that device positions the product to be calibrated.
In one embodiment, the adjustment instruction is further used to control the image acquisition device on the vertical positioning robot and the image acquisition device on the horizontal positioning robot to remain perpendicular at 90° in space.
In one embodiment, the robot control apparatus further comprises a basic model acquisition module, which is used to acquire the basic model through the image acquisition devices, the vertical positioning robot and the horizontal positioning robot before the error calculation module 106 calculates the first error between the vertical position and the basic model and the second error between the horizontal position and the basic model.
In one embodiment, the basic model acquisition module is further configured to send a first calibration instruction to the vertical positioning robot and a second calibration instruction to the horizontal positioning robot. The first calibration instruction is used to control the image acquisition device on the vertical positioning robot to determine the vertical basic model according to the initial vertical spatial orientation; the second calibration instruction is used to control the image acquisition device on the horizontal positioning robot to determine the horizontal basic model according to the initial horizontal spatial orientation.
In one embodiment, the robot control apparatus further comprises a network segment allocation module, which is used to allocate the vertical positioning robot, the horizontal positioning robot and the image acquisition devices to the same network segment before the coordinate unification module 102 calibrates them to the same world coordinate system.
The respective modules in the robot control device described above may be implemented in whole or in part by software, hardware, and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a server, and its internal structure diagram may be as shown in fig. 6. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operating system and the computer program to run on the non-volatile storage medium. The database of the computer device is used for storing data such as basic positions, horizontal positions, basic models, space coordinates and the like. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a robot control method.
Those skilled in the art will appreciate that the architecture shown in fig. 6 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the above-described method embodiments when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In an embodiment, a computer program product is provided, comprising a computer program which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory can include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational and non-relational databases. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, etc., without limitation.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described, but as long as there is no contradiction in a combination of technical features, it should be considered to fall within the scope of this specification.
The above-mentioned embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of the present application should be subject to the appended claims.

Claims (10)

1. A robot control method, characterized in that the method is used for controlling a robot, the robot comprising a vertical positioning robot and a horizontal positioning robot, an image acquisition device being provided on each of the vertical positioning robot and the horizontal positioning robot for visually positioning a product to be calibrated, the control method comprising the following steps:
calibrating the vertical positioning robot, the horizontal positioning robot and the image acquisition device to the same world coordinate system;
after a product to be calibrated is positioned through the image acquisition device on the vertical positioning robot, the vertical position of the image acquisition device on the vertical positioning robot is acquired, and after the product to be calibrated is positioned through the image acquisition device on the horizontal positioning robot, the horizontal position of the image acquisition device on the horizontal positioning robot is acquired;
calculating a first error between the vertical position and a basic model and a second error between the horizontal position and the basic model;
when the first error and the second error are both within an allowable error range, calculating space coordinates according to the vertical position, the horizontal position and the basic model;
and sending the space coordinate to the vertical positioning robot and the horizontal positioning robot, so that the vertical positioning robot and the horizontal positioning robot work based on the space coordinate.
2. The method of claim 1, wherein after calculating the first error between the vertical position and the base model and the second error between the horizontal position and the base model, further comprising:
when the first error or the second error is not within the allowable error range, sending an adjustment instruction to each robot, and returning to the step of acquiring the vertical position of the image acquisition device on the vertical positioning robot after the product to be calibrated is positioned by the image acquisition device on the vertical positioning robot, and acquiring the horizontal position of the image acquisition device on the horizontal positioning robot after the product to be calibrated is positioned by the image acquisition device on the horizontal positioning robot; wherein the adjustment instruction is used for controlling each image acquisition device to adjust its posture.
3. The method of claim 2, wherein the adjustment instruction is further configured to control the image acquisition device on the vertical positioning robot and the image acquisition device on the horizontal positioning robot to remain perpendicular at 90° in space.
4. The method of claim 1, wherein prior to calculating the first error between the vertical position and the base model and the second error between the horizontal position and the base model, further comprising:
and acquiring a basic model through the image acquisition device, the vertical positioning robot and the horizontal positioning robot.
5. The method of claim 4, wherein the base model comprises a vertical base model and a horizontal base model, and wherein obtaining the base model by the image capture device, the vertical positioning robot, and the horizontal positioning robot comprises:
sending a first calibration instruction to the vertical positioning robot; the first calibration instruction is used for controlling an image acquisition device on the vertical positioning robot to determine a vertical basic model according to an initial vertical space direction posture;
sending a second calibration instruction to the horizontal positioning robot; and the second calibration instruction is used for controlling an image acquisition device on the horizontal positioning robot to determine a horizontal basic model according to the initial horizontal space direction posture.
6. The method of claim 1, wherein before calibrating the vertical positioning robot, the horizontal positioning robot, and the image acquisition device to be under the same world coordinate system, further comprising:
and distributing the vertical positioning robot, the horizontal positioning robot and the image acquisition device to the same network segment.
7. A robot control device, characterized in that the device is used for controlling a robot, the robot comprising a vertical positioning robot and a horizontal positioning robot, an image acquisition device being provided on each of the vertical positioning robot and the horizontal positioning robot for visually positioning a product to be calibrated, the device comprising:
the coordinate unification module is used for calibrating the vertical positioning robot, the horizontal positioning robot and the image acquisition device to the same world coordinate system;
the position acquisition module is used for acquiring the vertical position of the image acquisition device on the vertical positioning robot after positioning a product to be calibrated through the image acquisition device on the vertical positioning robot, and acquiring the horizontal position of the image acquisition device on the horizontal positioning robot after positioning the product to be calibrated through the image acquisition device on the horizontal positioning robot;
the error calculation module is used for calculating a first error between the vertical position and a basic model and a second error between the horizontal position and the basic model;
the control coordinate calculation module is used for calculating space coordinates according to the vertical position, the horizontal position and the basic model when the first error and the second error are both within an allowable error range;
and the execution module is used for sending the space coordinates to the vertical positioning robot and the horizontal positioning robot so as to enable the vertical positioning robot and the horizontal positioning robot to work based on the space coordinates.
8. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 6.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program realizes the steps of the method of any one of claims 1 to 6 when executed by a processor.
CN202210527441.1A 2022-05-16 2022-05-16 Robot control method, apparatus, computer device, and storage medium Active CN114750160B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210527441.1A CN114750160B (en) 2022-05-16 2022-05-16 Robot control method, apparatus, computer device, and storage medium

Publications (2)

Publication Number Publication Date
CN114750160A true CN114750160A (en) 2022-07-15
CN114750160B CN114750160B (en) 2023-05-23


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115533908A (en) * 2022-10-11 2022-12-30 江苏高倍智能装备有限公司 Alignment control method and system for multi-manipulator matched workpiece lifting

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS648408A (en) * 1987-07-01 1989-01-12 Hitachi Ltd Method for matching coordinate system
US20050273199A1 (en) * 2004-06-02 2005-12-08 Fanuc Ltd. Robot system
DE102006004153A1 (en) * 2006-01-27 2007-08-02 Vision Tools Hard- Und Software Entwicklungs-Gmbh Method of obtaining the relation between coordinate systems of two robots uses imaging cameras on each robot directed at three marks for different robot positions
WO2014161603A1 (en) * 2013-04-05 2014-10-09 Abb Technology Ltd A robot system and method for calibration
CN106272444A (en) * 2016-08-31 2017-01-04 山东中清智能科技有限公司 A kind of realize trick relation and method that dual robot relation is demarcated simultaneously
CN107995885A (en) * 2016-11-18 2018-05-04 深圳配天智能技术研究院有限公司 A kind of coordinate system scaling method, system and device
JP2018094649A (en) * 2016-12-09 2018-06-21 ファナック株式会社 Robot system with plural robots, robot control device, and robot control method
CN108519055A (zh) * 2018-04-26 2018-09-11 华中科技大学 A vision-based online calibration method for dual-robot relative pose
CN108687776A (zh) * 2017-04-05 2018-10-23 大族激光科技产业集团股份有限公司 A robot control system
US20190143523A1 (en) * 2017-11-16 2019-05-16 General Electric Company Robotic system architecture and control processes
CN112692828A (en) * 2020-12-18 2021-04-23 上海新时达机器人有限公司 Robot calibration method, system, device and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Fu Gui: "Research on Calibration Methods for Industrial Robots Based on Machine Vision" *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115533908A (en) * 2022-10-11 2022-12-30 江苏高倍智能装备有限公司 Alignment control method and system for multi-manipulator matched workpiece lifting
CN115533908B (en) * 2022-10-11 2023-10-03 江苏高倍智能装备有限公司 Alignment control method and system for multi-manipulator matched workpiece lifting

Also Published As

Publication number Publication date
CN114750160B (en) 2023-05-23

Similar Documents

Publication Publication Date Title
KR102532072B1 (en) System and method for automatic hand-eye calibration of vision system for robot motion
JP6591507B2 (en) Simultaneous kinematics and hand-eye calibration
CN109153125B (en) Method for orienting an industrial robot and industrial robot
US10331728B2 (en) System and method of robot calibration using image data
CN111801198A (en) Hand-eye calibration method, system and computer storage medium
CN113362396B (en) Mobile robot 3D hand-eye calibration method and device
CN110253574B (en) Multi-task mechanical arm pose detection and error compensation method
JPH1137721A (en) Method for linearly estimating three-dimensional position by affine camera correction
WO2021169855A1 (en) Robot correction method and apparatus, computer device, and storage medium
JPWO2018043525A1 (en) Robot system, robot system control apparatus, and robot system control method
CN113379849A (en) Robot autonomous recognition intelligent grabbing method and system based on depth camera
CN112330752A (en) Multi-camera combined calibration method and device, terminal equipment and readable storage medium
WO2023083056A1 (en) Method and device for calibrating kinematic parameters of robot
CN114347013A (en) Method for assembling printed circuit board and FPC flexible cable and related equipment
CN114750160B (en) Robot control method, apparatus, computer device, and storage medium
JP2682763B2 (en) Automatic measurement method of operation error of robot body
US20240001557A1 (en) Robot and robot hand-eye calibrating method
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution
CN114833825A (en) Cooperative robot control method and device, computer equipment and storage medium
WO2020010625A1 (en) Method and system for optimizing kinematic model of robot, and storage device.
Tsai et al. Overview of a unified calibration trio for robot eye, eye-to-hand, and hand calibration using 3D machine vision
CN115446836A (en) Visual servo method based on mixing of multiple image characteristic information
CN115567781A (en) Shooting method and device based on smart camera and computer equipment
JP7423387B2 (en) Calibration system, information processing system, robot control system, calibration method, information processing method, robot control method, calibration program, information processing program, calibration device, information processing device, and robot control device
CN109146979B (en) Method for compensating for deviation of mechanical arm from walking position

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant